At two workshops hosted by the Engaging News Project, an organization that researches online audience engagement, digital strategists discussed what audience metrics mattered most in their newsrooms.
Representatives from legacy outlets (The Washington Post, The Wall Street Journal, CNN), newer digital ventures (The Daily Beast, Politico, Vox) and nonprofit/public media organizations (The Texas Tribune, NPR) had wide-ranging conversations about the merits and drawbacks of the leading metrics platforms, and about how to capture both devoted and fly-by users.
During each workshop, the discussion briefly turned to seldom-referenced metrics: audience awareness and knowledge. A newspaper editor argued that journalists have an obligation to measure reader comprehension. A social media editor asked whether newsrooms could solicit feedback from readers about which posts were most useful for understanding news topics.
“Both times we heard someone say, ‘I wish we knew whether people comprehended our work,’” said Talia Stroud, director of the Engaging News Project and an associate professor of communication studies at the University of Texas at Austin. “Everyone around the table nodded.”
But few had answers.
That largely remains the case more than a year after the final workshop. There are many reasons why news organizations do not commonly measure whether following their coverage is associated with increased awareness or knowledge of specific issues. These metrics don’t fit neatly into a spreadsheet. Measuring them can be costly and time-intensive. Results can be hard to swallow. And there are many other ways to measure impact, including how widely news coverage is referenced and shared, and whether it helps spur changes in public policy or public opinion.
Yet the dearth of newsrooms measuring reader awareness and comprehension doesn’t mean that there’s a lack of interest among journalists in such data, according to Stroud and other engagement experts interviewed.
As more news organizations seek to explain complex topics in engaging ways online, there’s a growing need for data on the effectiveness of their efforts. In a white paper on measuring the impact of nonprofit journalism, ProPublica president Richard Tofel wrote that “Explanatory journalism seeks primarily to elucidate.… The impact of explanatory journalism will be determined by measuring how much readers’ awareness or understanding has increased.”
Foundations interested in investigating the impact of journalism have partnered with news organizations on projects to test ways of measuring audience awareness and comprehension. Funding is available – but only if there’s the right match.
Determining Outcomes, Aligning Incentives
The Bill & Melinda Gates Foundation, which last year spent roughly $30 million in media-related grants, places a heavy emphasis on measuring impact.
“Our goal with the media funding that we do is about raising awareness and deepening knowledge and understanding of the issues that the foundation works on,” said Manami Kano, deputy director for global media partnerships at the Gates Foundation.
Kano previously managed the foundation’s education media grants, including funding for news organizations such as Education Week, Chalkbeat and The Hechinger Report. Kano’s team helped determine the framework for how the foundation thinks about the impact of its media grants.
(Full disclosure: The Gates Foundation is funding MediaShift’s new MetricShift section via its fiscal sponsor, BAVC.)
“In terms of having the outcome we want, it’s really about how has the audience changed? Have they really understood these issues? Do they follow these issues more? Are they more knowledgeable about that? Are they talking to others about it?” Kano said.
The Gates Foundation provides support to media organizations that share some of these priorities — even if they are not the top priority.
“In theory, [the grantees] want the information, but in practice is it the highest priority? It’s probably more of a priority for us,” Kano said.
Questions about which audience impact metrics to measure should start with a news outlet examining its mission, according to many of the engagement experts interviewed.
“If your goal is to generate more awareness or knowledge, it’s absolutely worth figuring out how to invest in that kind of measurement,” said Lindsay Green-Barber, director of strategic research at the Center for Investigative Reporting.
But measuring awareness and comprehension is rarely part of a news organization’s culture, said Jessica Clark, director of research and strategy at Media Impact Funders, a network of grantmakers interested in impact. Clark, who authored a report on impact assessment, said commercial journalism is typically oriented around the audience traffic metrics that matter most to advertisers.
While there is typically no financial incentive for commercial news outlets to measure awareness and knowledge, the incentives may line up for nonprofit news sites that find foundations interested in that mission, Green-Barber said.
An ideal scenario, according to Jim Schachter, vice president for news at WNYC, is “when the thing the funder supports lines up with what you already want to do.” That was the case for the New York NPR affiliate, which launched a cross-platform initiative to increase awareness of health issues and improve health outcomes with the help of several funders, including the Robert Wood Johnson Foundation.
The Rita Allen Foundation, which invests in civic literacy and engagement projects, helps fund efforts by the Engaging News Project, Media Impact Funders and Solutions Journalism Network to measure impact. Elizabeth Good Christopherson, president and CEO of the Rita Allen Foundation, said efforts to measure awareness and knowledge align with its mission of “creating an engaged and informed citizenry.”
Much of the foundation funding to measure impact goes to public broadcasting and nonprofit news sites. Clark said the heavy commercial investment in the for-profit, online news site Vox is a sign of interest in explanatory journalism. Whether or how outlets such as Vox and FiveThirtyEight measure their effectiveness in explaining the news is an open question (attempts to reach editors at both news organizations were unsuccessful).
Dana Chinn, director of the USC Annenberg Norman Lear Center Media Impact Project, said the most important factor in determining whether a news outlet measures impact metrics such as awareness and comprehension isn’t its tax status but instead its leadership team. “Questions about whether and how to measure impact are personal,” she said.
And sometimes, journalists can take the results personally. “It’s not difficult to do any of this,” said Chinn, who teaches and researches audience analytics. “What’s difficult is to see the information and act on it and admit that something isn’t going right.”
Green-Barber said the idea of measuring these metrics is more palatable if news outlets stop thinking about it as an evaluation that may lead to punishment from funders if reader awareness or comprehension is lower than anticipated.
“It’s not about whether you are successful, but instead what worked and what didn’t and how that can inform our next project,” Green-Barber said.
Added Kano: “Once the results come in, people are very interested in what that says and for the most part I’ve seen the news organizations then apply it to their strategies around audience development and content. They realize it’s not just for us to do evaluations but really for them to improve their product and their business to a large degree.”
Different Methods of Measurement
Awareness, which Chinn considers a “first-level” outcome, typically refers to whether audiences can recall news content. Kano said she considers metrics such as active time on a page and scroll depth to be proxies for awareness (though she said the data are imperfect). Chinn said news organizations commonly measure awareness by what articles a user visited online.
Self-reported information (pre-tests and post-tests about awareness of a news topic) is another option. Green-Barber said one of the difficulties with this method of measuring awareness is waiting several months to determine if audiences can still recall the information. Another challenge: asking questions on the pre-survey that don’t make participants feel inadequate.
Green-Barber wrote in a recent report that while changes in laws and policies are “real, discrete events we can measure,” changing awareness of an issue “proves to be much more difficult to know.” In an attempt to measure awareness, she designed a survey completed by residents of Oxnard, Calif., before and after they read a Center for Investigative Reporting investigation into pesticide use in the community. Results showed that residents were significantly more aware of pesticide use, and more concerned about its adverse health effects, after reading coverage from CIR, its distribution partner The Guardian, and other news outlets that reported on the findings.
WNYC has adopted a different approach to measuring awareness. Over the past several years, the station focused its programming on health issues such as sleep deprivation, sticking to exercise regimens and hearing loss. Most recently, the WNYC podcast Note to Self launched the series “Infomagical” about managing information overload. Partnering with researchers, WNYC asked its audience to follow its news coverage, take part in challenges such as a day without multitasking and report back results.
Paula Szuchman, vice president of on-demand content for WNYC, said one measure of awareness for these projects is whether monthly podcast downloads and newsletter subscriptions increase. Another is the number of people who report their health data back to the station. While Szuchman acknowledges that this self-reported data isn’t scientific, she was heartened that thousands of listeners became increasingly aware of these health issues and reported positive health changes.
Added Schachter, WNYC’s vice president for news: “I go to a lot of meetings with funders, and to be able to tell an accurate, honest story about the depth of our audience’s engagement — that people will stick with us, do this amount of work on our behalf and on our community’s behalf — that’s a terrific story to be able to tell.”
Increasing audience knowledge or comprehension of news content can be more difficult — and more challenging to measure. Chinn considers this a “second-level” outcome and Green-Barber calls it a “micro,” individual-level outcome similar to changes in attitude or behaviors.
Comprehension is often measured through self-reported surveys (what did a person know about a news topic before and after reading a story?). The Engaging News Project helps news outlets develop online polls that test reader knowledge and, study results show, increase readers’ time on page.
Solutions Journalism Network conducted an A/B test in 2014 in which randomized participants read articles about social issues that were either conventionally written or included information about solutions to the problem. Results showed that people who read the solutions-based stories were more likely to report feeling more knowledgeable and better informed about the issues than those who read the conventional articles.
Increasing awareness and knowledge is the ultimate goal of solutions journalism, said Rikha Sharma Rani, intelligence director for the Solutions Journalism Network.
“When writing about solutions you are also writing about the problem and how people have addressed that,” Rani said. “You are giving readers a new perspective on issues that are well covered but not from that angle.”
Chinn said inferences about knowledge can also come from behavioral data about how a user interacts with a news site. Kano said the Gates Foundation often tries to triangulate self-reported survey data about what audiences have learned with knowledge questions posted on news sites.
News outlets can claim that reading their content is associated with higher levels of knowledge about a topic, but proving causation is difficult. And to measure changes in knowledge, news outlets have to reach a certain threshold of coverage, according to Chinn.
“If you say you want to have a change in people’s knowledge about the Common Core State Standards Initiative in education policy but in analyzing the content you only have one article a month on that topic, even the best data in the world won’t be able to tell you that much,” Chinn said.
With the right mix of coverage and the will to measure its impact on audiences, news organizations can measure awareness and knowledge, Chinn said.
“All of these actions are measurable,” Chinn said. “It just takes a recipe or cocktail of different methodologies.”
An Investment of Resources
Whether a newsroom conducts its own audience research or uses outside researchers depends on what level of sophistication is needed and what decisions are riding on the data, Chinn said. Soliciting readers to take online polls or track their news habits can provide useful information to newsrooms, but it isn’t the most scientific sampling method because of self-selection bias, Stroud said.
The vast majority of news organizations don’t have a dedicated researcher on staff to work on such projects. “Newsrooms are busy places and are concerned about costs,” Stroud said. “They don’t have time to test every article.”
That’s true for ProPublica, which charts impact through measures such as prominent reprints or references to a story and public policy changes, but has yet to systematically measure whether reading its coverage is associated with increased reader awareness or knowledge.
“In a perfect world where it didn’t cost money and time, I’d love to know that,” Tofel said. “I just don’t know of any news organization that has the money and the time to spend a lot of both researching these things.”
Kano acknowledges that measuring and reporting impact metrics is time-consuming. “What we are requiring is a shift,” she said.
At the Engaging News Project workshop, participants agreed that it’s a necessary shift. Among the group’s recommendations: new metrics should be created based on the content of an article to determine if readers are benefiting from a story.
Tom Negrete, who was the Sacramento Bee’s director of innovation and news operations at the time of the workshop, made the suggestion about measuring comprehension that helped spark the conversation about what metrics should matter most. Now the associate editor of CALmatters, a nonprofit that covers politics and policies in California, Negrete is working with researchers on ways to measure audience comprehension of news coverage.
“It’s a whole different conversation if an editor can say, ‘if you write your story this way people will understand it better,’” Negrete said. “For journalists, that’s a more motivating principle than just getting clicks.”
Elia Powers, Ph.D., is an assistant professor of journalism and new media at Towson University. He writes regularly about news literacy, audience engagement and non-profit journalism.