Perspectives on the State of Research Readiness in Public Relations
The following was published in the March 2009 issue of PRSA’s Tactics:
I recently finished teaching the Research Process & Methodology course at
NYU’s master’s program in Public Relations and Corporate Communications. Serving as an adjunct professor is a great outlet for the coaching and mentoring that I’ve missed since becoming a communications and business development consultant in 2006. But this particular experience left me a bit unsettled. It drove home the fact that research remains a weak link in the development, execution and evaluation of PR programs.
Becoming Involved
I was invited by my friend and the academic director of the NYU program, John Doorley, to guest lecture within the program and later teach the Strategic Communication course. Then came a request to teach Research Process & Methodology.
Here, however, I had some reservations. Like Strategic Communication, RP&M was mandatory, but this was a subject that most students dreaded. We were to cover qualitative research, including interviews and focus groups, and quantitative methods such as sample selection, questionnaire design and data collection and analysis. The objections were along the lines of, “We’re the creative people, not number crunchers. We’re supposed to be intuitive — that’s why people hire us!”
Even though I had the feeling of being dropped behind enemy lines, I was energized by the opportunity to convey the need for more rigor in PR programming. Now, more than ever, it is necessary to identify competitive, political and social issues; understand key audiences, their preferences and perceptions; differentiate companies and products; and demonstrate the impact of communications on business objectives and measure outcomes.
Clients crave creativity, but they don’t want guesswork. To be effective, we need information to drive insight. If PR professionals want to be strategic, then research is required to help make the best possible predictions about the intended and unintended outcomes of PR programming.
Getting Down to Business
Most of the students’ anxiety was assuaged soon after the first class began. Setting unambiguous expectations and goals was key. I made it clear that I did not intend to transform them into statisticians or programmers. Instead, they would become more informed users and evaluators of research. To reach this endpoint, and to keep the students engaged for the six hours per week that we met, we worked to strike a balance between theory and application — a mix of book learning with in-class exercises.
In one class, we reviewed and evaluated a variety of award-winning PR case studies, including corporate rebranding, the launch of new products and national events. They were eye-opening — even for me. If these were chosen to represent the best work of the industry, the pinnacle of public relations, then we — the profession — had some work to do. Why did those objectives look like a bunch of tactics? What did “we achieved a high level of buzz” mean? Was the result at all meaningful to the business? One hundred million impressions is a big number, but did the message reach the intended audience?
This analysis helped make the research process more real, more accessible. By the end of the course, the role of research in public relations was duly elevated in the minds of the students. They didn’t leave with a list of formulas memorized, but they did leave knowing how to ask good questions and take their place at the table of strategic PR planning.
Taking the Next Step
While feeling rewarded by the response of the students and the opportunity to connect with some terrific guest experts, I was left with some concerns. After I reviewed the final projects — detailed proposals for their Capstone papers — and posted the grades, I wondered about the state of research training in PR agencies. How do these firms really value research? What expectations do they have of the staff? How is research taught in other programs outside NYU?
I decided to send a three-question survey to the top 25 agencies. I asked whether they had an in-house training program, which courses they offered and whether any were mandatory. Only eight of the 25 surveys were returned — perhaps these were the firms that were most proud of their programs.
However, the results were still revealing. As one would hope and expect, 100 percent of the respondents offered a variety of courses and workshops. All of the agencies had programs in the traditional areas of public relations: writing skills, presentation skills and media relations. Some offered training in issues/risk/crisis management, advocacy/third-party relations, conflict resolution and digital media.
On the other hand, only three of eight said some or all classes were mandatory. I found this to be both surprising and disturbing. Competitiveness, differentiation and excellence are all based on learning.
When it came to metrics and numbers, the story did not improve. Only half of those responding reported that they offered training in research methods and/or measurement, and only half offered courses on budgeting and/or forecasting. I wasn’t hugely surprised by these answers, as I’ve known for a long time that numbers and words don’t mix for many PR practitioners.
But it’s time to push old habits aside. Indeed, these tough economic times reinforce the importance of numbers. “Good enough” is out. Precision is in. Of course, creativity and fresh ideas are essential. But we need balance: The same urgency that is placed on learning, thinking and executing the soft side of public relations must also be placed on the hard.