This summer, Rutgers University announced that, for the first time in its history, the university raised more than $200 million in an academic year—11 percent higher than the previous record set in 2015. Rutgers’ record-breaking fundraising year was achieved, in part, because of a record 49,736 donors to the university. The Rutgers University Alumni Association’s (RUAA) vision of instituting an individualized alumni relations program was critical to this effort, but with 480,000+ alumni, it’s fair to question how such an approach was feasible—or even realistic.
The answer is that a strategic commitment to market research enabled the university to develop alumni programs that better met the needs and interests of graduates, ensuring relevance to Rutgers alumni, friends, and donors. The Rutgers University Foundation partnered with SimpsonScarborough on a comprehensive four-part research effort designed to strengthen alumni connectivity to the university and each other. After two quantitative studies, the RUAA used the data to make programmatic and communication shifts that aligned with alumni preferences and can be tailored for an individual approach. Here are just a few examples:
- Data showed that Rutgers alumni had strong preferences for specific days and times. The RUAA acted immediately to shift from early-week events to Thursday nights and weekends. The findings also identified specific cost thresholds, so the team now looks to provide more entry-level events.
- The research findings provided insights into desired event topics. The association developed an entirely new series, Alumni on Location, that combined many of the features toward which alumni responded favorably.
- Career development was identified as a missing benefit for Rutgers alumni. In response, the association is building out, in concert with University Career Services, a new Alumni Career Development program whose deliverables are also being shaped by the research findings.
- The data was clear: Millennial and Gen-X alumni want to volunteer, and in very specific ways. Currently, the RUAA is working to develop opportunities for these cohorts that match the interests demonstrated in the research findings.
- The data showed that one of alumni's core expectations of the association is relevant and timely information delivery. By investing in marketing automation resources, the organization is now positioned to better serve alumni in this priority area.
Rutgers continues to use market intelligence to fuel its momentum. The university is currently in the midst of a third quantitative study, this time focusing on the highest-rated engagement activities from the earlier research to better understand “first-time” and “regular” engagers, the value their relationship with Rutgers provides to their lives, and how those opportunities make them feel.
We are living in an age of data. And difference. Of popularity. And polarization. And, oftentimes, we’re presented with plenty of information but little clarity.
It’s a dilemma facing many higher education marketers as they try to develop strategies for an overwhelming number of audiences, communications channels, and goals and objectives, not to mention a broad spectrum of campus stakeholders ranging from self-professed marketing experts to those who feel marketing devalues the purity and value of education.
Adding to the challenge is that brand strategy is both art and science. Data should inform your strategy. But the process of co-creating a strategy on campus is just as important. It requires collaboration, internal insight, and buy-in.
So where to begin in homing in on the right direction?
At SimpsonScarborough, we believe that research is at the core of any meaningful brand strategy and creative. That an enduring brand is not the result of “my gut tells me,” but rather, of thoughtful, tested inputs. There are many ways for college and university marketers to collect those inputs. But they’re not all equal.
There’s been a tendency in higher education to lean on broad generational data, which has become especially trendy of late. But while there are certainly commonalities and distinct differences among generations (e.g., Baby Boomers, Gen X, Millennials, Gen Z), the reality is that generational data looks for points of commonality across very wide age ranges. Boomers are usually classified as those ages 51-69, while Gen Xers can be anywhere from 35-50. Millennials could have been born anywhere between 1982 and 2002, with Gen Z born after that.
Trend data, while interesting, can be a bit like relying on your horoscope: too vague or broad to mean much. And the distinct differences these reports describe? They’re not so exact either. Common myth, and many a report, would have you believe that those self-centered Millennials are job-hopping, always looking for the next great gig. But according to a recent Pew study, Millennials’ job tenure is no different from that of the Gen Xers in my age bracket. And those Gen Zers who supposedly dislike email? In mStoner’s annual study, “Mythbusting Enrollment Marketing,” email is the way they prefer to hear from colleges and universities.
Even the recent (and troubling) Pew and Gallup studies on the differing views of higher education among Republicans and Democrats likely mask a lot of variance based on geography, whether respondents attended college themselves, or whether they are currently paying to send their kids to school; those factors unfortunately didn’t make the 250-word news stories covering the reports.
Trends are better used as starting points for discussions about your strategies, not the end point. Building your strategy on this information alone will leave you with creative expressions that feel generic or indistinct. In an industry already mocked for interchangeable taglines, creative, and visual styles, that’s a risk you can’t afford to take.
So what’s a college to do?
- First, find out what data already exists on your campus. Many schools subscribe to or collect institutional data regularly. This might include NSSE data on student experience or the Admitted Student Questionnaire (ASQ) administered through CollegeBoard.
- Look at sources that focus specifically on college or college-bound students. These might include the CIRP Freshman Survey, which studies incoming college students annually with trends dating back to 1973, or Beloit College’s annual Mindset List of incoming students. While these kinds of omnibus studies are generally better suited to broad trends, or to administrators concerned about the delivery of the academic or student experience, overall campus culture, or other issues, there could still be interesting findings that support some of your brand focus.
- Analyze your own social and web data. The reality of our current digital worlds is that they are full of data on what content, experiences, and engagements your audiences expect. But this data must be managed and optimized with regularity to get the best out of it.
- Conduct your own research. The great thing about customized market research is that the questions are specific to your institution. Deep, focused research can dig into the personality, emotional connection, and value/benefits specific to your brand experience. Differences between audiences, and secondary factors within them, can be analyzed to surface insights that are distinctive to your school. Budget and timing are critical factors in deciding how much research you can do, but nothing substitutes for asking specifically about your college or university and its people.
Join us on Monday, November 13 from 10:15-11:00 a.m. at the AMA Symposium for the Marketing of Higher Education to learn how we take an audience-centered approach to getting insights about your schools in our session, “Power to the People: A Human-Centered Approach to Marketing Strategy.”
There’s a tendency to view creative—viewbooks, videos, advertising, or your website—as your institution’s brand. The truth is, no matter how great the creative might be, it isn’t your brand. Creative is simply the expression of a brand, the output of two equally important inputs—research and strategy. Unfortunately, more and more today, creative is nothing more than an expression of mutually agreed-upon opinions. That’s why so much is ultimately forgettable. And that is why I joined SimpsonScarborough several months ago.
Having worked at Notre Dame as the University’s only marketing writer, I know just how essential research is to creative work. Is it necessary to write a clever line or to design a beautiful layout? No. But for a brand to be something more than just clever and beautiful, for it to be truly authentic and unique, inspiring and enduring, thorough research and thoughtful strategy are absolute musts.
I eventually left campus and found myself at another agency focused on higher ed. There, we often wrote and designed first, then forced the research findings and strategy to align with our creative, rather than the other way around. We didn’t like to test creative because the people reviewing it wouldn’t offer useful insight. After all, they weren’t “creatives”; they were just our target audiences.
SimpsonScarborough’s two most recent creative hires share my belief—our belief—that research and strategy lead to impactful creative. They also have invaluable on-campus experience that informs their thinking and inspires their work. It is with great excitement that we introduce you to Jason Shough and Tyler Bergholz.
Jason Shough, a graduate of the Medill School at Northwestern University and The Ohio State University, is an award-winning copywriter. From 2012–2014, he was also a speechwriter for Dr. Gordon Gee. At Ohio State, his former boss used to say, “You need nerves like steel pipes to be a college president.” To Jason, the same rule applies to doing great brand work. Whether it’s research, strategy, or creative, you need guts to break through the noise — and Jason’s never backed down from that challenge. In the last five years, he’s written celebrated campaigns for Northwestern University, University of California, Monmouth College, Antioch College, Western State Colorado University, and Rotterdam School of Management. He’s also served as lead creative for several global pharmaceutical brands. Today, we welcome him as our Associate Creative Director, Copy.
Tyler Bergholz is our new Associate Creative Director, Design. A graduate of East Carolina University, Tyler is a phenomenal graphic designer and artist. He’s also a consummate pro who’s as much at home with scissors, a Sharpie, paper, and some glue as he is with Adobe Creative Suite. Imagine an old soul with access to today’s technology, and that’s Tyler. He possesses big-time agency, campus, and corporate experience and is a multiple CASE and Addy award-winning designer who has been featured in numerous publications, including Lürzer’s Archive magazine and AIGA 50 Books/50 Covers. Tyler has designed for North Carolina State University, the University of California, The Andy Warhol Museum, The Roberto Clemente Museum, GNC, Conair, Cooper Tires, Ruby Tuesday, and Rite Aid.
Generally speaking, choosing a framework for developing a marketing budget is the ultimate exercise in navigating higher-ed politics and bureaucracy. And if you want (or more likely, need) to allocate by program, the task becomes that much more complex. There’s no perfect way to do it; as Bill Campbell, Vice President of Marketing & Communications at Chatham University, said in April of this year on a Higher Ed Live Podcast, no one model fits every institution.
Higher ed marketing communications offices increasingly use ROI measures to inform marketing spend, especially now that digital marketing analytics are so readily available. In simple terms, this approach divides marketing and advertising costs by the total number of inquiries generated to determine a cost per inquiry. If you’re trying to determine ROI by program, however, it’s not always a perfect science, since programs often don’t have dedicated communications budgets but instead share commingled funds with other programs.
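To make the cost-per-inquiry math concrete, here is a minimal sketch in Python. All program names, budget figures, and inquiry counts are invented, and the inquiry-share allocation rule is just one assumption among several defensible options:

```python
# Hypothetical sketch: estimating cost per inquiry when programs share
# a commingled marketing budget. Names and figures are invented.

# Shared advertising pool split across three graduate programs, weighted
# here by each program's share of total inquiries (one simple allocation
# rule; enrollment share or impression share would be alternatives).
shared_budget = 60_000.00
inquiries = {"MBA": 400, "MS Data Analytics": 250, "MEd": 150}

total_inquiries = sum(inquiries.values())
for program, n in inquiries.items():
    allocated = shared_budget * n / total_inquiries  # program's slice of the pool
    cost_per_inquiry = allocated / n                 # simplifies to pool / total
    print(f"{program}: ${allocated:,.0f} allocated, "
          f"${cost_per_inquiry:.2f} per inquiry")
```

Note that allocating a shared pool in proportion to inquiries makes every program’s cost per inquiry identical, which is exactly why commingled funds make program-level comparisons slippery: a meaningful per-program figure needs either dedicated budgets or an allocation rule that doesn’t depend on the outcome being measured.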
So how can colleges and universities create a more comprehensive process for allocating marketing dollars to many programs that are often vastly different in size and scope? At SimpsonScarborough, we recently completed such a project for one institution’s adult and graduate programs. Here’s what we learned.
First, determine institutional priorities. Institutional prioritization is multifaceted and subjective, requiring important yet difficult discussions among senior leadership, department chairs, and the provost. During these discussions, some programs may receive additional marketing dollars if they:
- align well with the school’s mission or its current strategic plan;
- have deeper faculty expertise with a notable reputation and/or ranking;
- have adjunct or full-time professors available for increased course loads;
- require fewer resources and facilities to administer the program; and
- offer higher profitability (tuition revenue less the cost to administer the program) and are therefore desirable for revenue generation.
Next, assess the employment outlook. With the assistance and approval of academic departments, each academic program is assigned an IPEDS program code, which can be linked to an occupation code in the Bureau of Labor Statistics’ (BLS) Occupational Outlook Handbook. The size of the overall job market, growth or decline in total projected jobs, and total annual openings by program should all inform marketing budget allocation. BLS data alone isn’t enough to justify budgets by program, however, because it isn’t real-time. Software tools are available that collect and analyze tens of thousands of current job listings by the education, skills, or expertise required, and then link that data to academic programs. Both the regional and national employment outlook should inform program marketing budgets.
Finally, track competitor program growth and decline. IPEDS data can also be used to determine market share per program and growth or decline in completions across a competitive set. Projected growth, calculated from patterns of previous growth, can also indicate a program’s potential. This analysis shows how crowded the “space” is for each program and which institutions dominate market share. Institutions may choose to increase budget allocations for programs where competition is limited.
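As a rough illustration of the market-share and growth calculations, here is a sketch using invented completions figures for a hypothetical competitive set:

```python
# Illustrative sketch (invented numbers): market share and growth from
# IPEDS-style completions data for one program's competitive set.

completions = {            # degrees conferred in the program, by institution
    "Our University": {2014: 110, 2016: 130},
    "Competitor A":   {2014: 300, 2016: 270},
    "Competitor B":   {2014: 90,  2016: 140},
}

latest, baseline = 2016, 2014
total_latest = sum(years[latest] for years in completions.values())

for name, years in completions.items():
    share = years[latest] / total_latest                       # market share
    growth = (years[latest] - years[baseline]) / years[baseline]  # 2-year growth
    print(f"{name}: {share:.0%} share, {growth:+.0%} growth")
```

Even this toy set shows the pattern the article describes: one institution can dominate share while a smaller competitor posts the fastest growth, and both signals belong in the allocation conversation.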
One important caveat about IPEDS data: it can misrepresent the marketplace because institutions code programs inconsistently. A representative from each academic program should review the competitor set, since the departments know their competitors best. Their knowledge helps ensure the market share calculations are based on a set of IPEDS codes that realistically represents the competitive landscape. The growth or decline of each program nationally is also worth consulting, given the mobility of graduates and the proliferation of online programs.
In closing, when deciding how to allocate marketing spend by program, remember that there is no definitive formula or data set. However, the more information you collect to augment traditional ROI calculations, the more effective your marketing budgets can be. Institutions that inform their allocations in part with program growth data likely stand a better chance of seeing results. And the more marketing data you collect and code by program, the better you can refine the model over time, creating a system of flexible and defensible allocations.
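One way to pull the three inputs together (institutional priorities, employment outlook, competitive opportunity) is to score each program and allocate the budget proportionally. This is only a sketch: the programs, the 1-5 scores, and the equal weighting of factors are all assumptions an institution would replace with its own rubric.

```python
# Hypothetical sketch of turning priority, employment-outlook, and
# competitive-opportunity scores into budget weights. All values invented.

budget = 500_000.00
# Scores on a 1-5 scale: (institutional priority, employment outlook,
# competitive opportunity; higher opportunity = less crowded market).
scores = {
    "MBA":              (5, 4, 2),
    "MS Cybersecurity": (4, 5, 4),
    "MA History":       (2, 2, 3),
}

weighted = {p: sum(s) for p, s in scores.items()}  # equal weighting of factors
total = sum(weighted.values())

for program, w in weighted.items():
    print(f"{program}: ${budget * w / total:,.0f}")
```

The proportional rule keeps every allocation traceable back to the scores, which is the “flexible and defensible” property the closing paragraph argues for: when a department questions its budget, the conversation moves to the scoring rubric rather than the dollar figure.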
This year, we have been giving our readers an occasional glimpse into SimpsonScarborough’s normative database, which contains findings from hundreds of studies we’ve conducted over our past decade of research work. We use this data to give our clients some context for their own survey findings. (Caveat: We can only include data gathered on standard questions that are worded the same and include the same set of response options.)
Brand health is something most schools are (or should be) tracking every few years. It measures how audiences feel about your college or university. The process starts with a foundational research study that provides baseline measures, then replicates that study every couple of years to see if and how the needle has moved. Measuring before and after a major marketing, branding, or advertising initiative is especially important if you want to understand a campaign’s impact on your key stakeholders. In this article, we will share findings on three brand-health metrics we have gathered from our work.
Unaided identification (or brand awareness). This is a simple way to measure the extent to which your institution is (or isn’t) top-of-mind in your market. A typical awareness question is, “When you think of excellent colleges/universities in [geographic area], which ones come to mind FIRST?” We typically ask respondents to write in schools located within a certain geography; other times we ask them to indicate schools best known for certain things (liberal arts, STEM, Jesuit education, etc.). “Unaided” means that respondents are not given a list of schools to choose from; they simply see blank lines and are asked to write in up to five schools. This is a great tool for measuring the effectiveness and reach of your marketing efforts: Are audiences thinking of your institution, or recognizing it, more than before?
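Once write-in responses are coded, scoring the metric is straightforward: awareness is simply the share of respondents who mention the client school anywhere in their list. This toy sketch uses an invented client name and invented responses:

```python
# Toy sketch of scoring an unaided-awareness question. Each respondent
# writes in up to five schools; responses and the client name are invented.

client = "Client University"
write_ins = [
    ["Client University", "State U"],
    ["Big Rival", "State U", "Client University"],
    ["Big Rival"],
    [],                      # blank responses still count in the base
]

aware = sum(client in mentions for mentions in write_ins)
awareness_rate = aware / len(write_ins)
print(f"Unaided awareness: {awareness_rate:.0%}")  # 2 of 4 respondents -> 50%
```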
Below are the average percentages for how often each audience writes in the name of the client school.
- Current undergrads: 58% identify client
  - Privates: 61%
  - Publics: 54%
- Current grads: 51%
  - Privates: 55%
  - Publics: 43%
- Alumni: 51%
  - No difference between public and private
- Faculty and staff: 34%
  - No difference between public and private
Notice how faculty and staff are lower? We call that the “faculty dip.” Faculty are typically an institution’s most critical audience, so it is not unusual for us to see their numbers run much lower than those for other audiences.
Familiarity. In a brand survey, one of the first things we do is ask respondents to rate their level of familiarity with your institution. This is important because if they are NOT familiar with your school, they can’t answer questions about it in any informed way. Familiarity is also a foundational brand-health metric. For this question, we give respondents a small subset of peer/aspirant schools to rate so that we can see how their familiarity with your institution compares to that of other institutions in your competitive set.
We ask respondents to indicate whether they have 1) never heard of this school, 2) only know the name, 3) are somewhat familiar, or 4) are very familiar. Average total familiarity (somewhat or very familiar) with the client school is shown below:
- Prospective undergrads: 65% somewhat or very familiar with client institution
- Prospective grads: 85% somewhat or very familiar
Preference (also known as likelihood to consider). This is a question we typically ask of prospective students to understand how likely they are to consider applying to and/or attending both the client institution and a small subset of its close peer/aspirant schools (ideally the same list we used in the familiarity question). Familiarity and likelihood to consider are closely correlated, and both metrics are key to determining the success and reach of your marketing and branding efforts.
Below are mean preference ratings for our client schools, on a rating scale where 1=not at all likely to consider and 10=extremely likely to consider:
- Prospective undergrads: mean of 4.5
  - Privates: 4.1
  - Publics: 5.3
- Prospective grads: mean of 5.2
  - No difference between public and private
These three simple metrics form the foundation for an institution’s brand benchmarking efforts. Tracking these metrics and others over time is critical to understanding the evolution of a brand, how well it is performing, and whether marketing objectives are being met. If you have conducted research with us in the past, gather those reports and see where you fall among other institutions we’ve worked with.
This is the third and (for now) final article in a three-part series in which we share general information we have gathered from our work over the past decade. Every day we are gathering more and more data on colleges and universities. As our normative database grows, we will continue to report back on our findings and insights.