This guest post was originally published on the Media Impact Project blog.
Big data and metrics are changing the way businesses of all types operate in a digital environment. From transportation (think Uber and Waze) to retail (think Amazon and Gilt), content (think Netflix and Facebook), and advertising (think Google), data is reshaping today’s most successful businesses and industries.
While data has changed the way news organizations produce and package content (think data journalism and data applications), the business of journalism has been slower to adapt to a world where every consumer action and decision can be tracked, measured, and analyzed.
The barriers have been stubbornly persistent, even within news organizations that grasp the concept, see metrics and measurement as an opportunity to improve their products and services, and have committed resources to the effort.
This was a key takeaway from three intrepid and determined news leaders at independent publishers around the globe after they participated in “Measuring Digital Media Impact 101,” a media metrics training course conducted by the USC Annenberg Norman Lear Center Media Impact Project on the USC campus in early June 2016.
The Open Society Foundations’ Program on Independent Journalism sponsored participants from three nonprofit news organizations: El Faro (El Salvador), Efecto Cocuyo (Venezuela), and Himal Southasian (Nepal). The course was led by Dana Chinn, Media Impact Project director. Three other media metrics analysts—Jason Alcorn, Justice Haque, and Alana Victor—also helped with hands-on training in both English and Spanish.
Media Metrics Starts with Asking the Right Questions
The program provided an intensive yet practical crash course on leveraging widely available measurement tools to answer the most persistent questions in newsrooms today:
Who is reading our content? What are they doing on our platforms? Where are they coming from, and where are they going? What content is working, and what is not? How do we translate these findings into action that increases our effectiveness, revenue, and impact?
The course syllabus stated the objectives of the program this way: “After completing this course, participants will know how to prepare, interpret and present basic and customized reports from Google Analytics for their staff, board members and funders.”
Participants were told that, as a result of attending the seminar, they would learn how to:
- Develop questions about measurable opportunities and problems that their organizations need to answer to fulfill their missions, including where their readers come from and how the topics they cover perform.
- Identify site-wide and story-level data that will help answer both editorial and business questions.
- Organize and analyze the data to make it easy to understand and act on both internally (to make their products and services better) and externally (to support revenue generation).
Over an intense two days, working from early morning to late evening, each participant was taken through a rigorous agenda that started with asking the right questions and ended with how to share the answers.
One of the first lessons learned was that most of the metrics people cite today are not very good and say little about what is really going on. Hits, page views, and even aggregate demographic information were each knocked off their perches as go-to data points. Instead, analysis of bounce rate, primary traffic sources, and user segments was shown to provide a far more nuanced, and far more actionable, picture.
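To make that concrete, here is a minimal sketch of pulling sessions and bounce rate by traffic source with the Google Analytics Core Reporting API (v3), the kind of segmented view the course favored over raw page views. It is not part of the course materials, and the service-account key file and view ID below are placeholders you would replace with your own.

```python
# Sessions and bounce rate by traffic source, via the Google Analytics
# Core Reporting API (v3). Requires google-api-python-client and oauth2client.
from oauth2client.service_account import ServiceAccountCredentials
from googleapiclient.discovery import build

SCOPES = ['https://www.googleapis.com/auth/analytics.readonly']

# Placeholder credentials file and view ID -- substitute your own.
credentials = ServiceAccountCredentials.from_json_keyfile_name(
    'service-account.json', SCOPES)
analytics = build('analytics', 'v3', credentials=credentials)

response = analytics.data().ga().get(
    ids='ga:12345678',                    # placeholder view (profile) ID
    start_date='7daysAgo',
    end_date='today',
    metrics='ga:sessions,ga:bounceRate',
    dimensions='ga:sourceMedium',
    sort='-ga:sessions',
    max_results=10,
).execute()

# Each row is [source/medium, sessions, bounce rate]; values arrive as strings.
for source, sessions, bounce_rate in response.get('rows', []):
    print(f'{source:<40} {sessions:>7} sessions  {float(bounce_rate):5.1f}% bounce')
```

Each row answers, at a glance, where readers come from and whether they stick around, pairing the editorial and business questions the seminar emphasized.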
At the end of the two days, the exhausted yet enthusiastic participants hopped planes back to their respective organizations to take home what they had learned and, hopefully, apply it in meaningful ways.
After Training, Publishers Still Need Support
It was my responsibility to find out just how effective the program was in helping these smaller, mission-driven news organizations become more responsive and, therefore, more effective in their businesses.
Within a few days of the participants’ return home, we asked them to complete an online survey. What we heard back was not surprising. While the participants rated the program, the staff, and the curriculum very highly, it was clear that each would have liked more time to absorb and practice what they had just learned. The generosity of the Open Society Foundations made this program possible, but the agreed-upon budget dictated the time and level of effort, particularly post-seminar, that the Media Impact Project staff were able to provide.
Not surprisingly, when asked what else we could provide these newly minted practitioners once they returned home, the answer was almost unanimous: more support and follow-up. It is one thing to perform a task with a dedicated resource in the classroom helping with every step; each participant reported that it was much more difficult to reproduce the work back home on their own.
Two months following the seminar in Los Angeles, we conducted Skype interviews with each participant one-on-one. Those conversations yielded even greater insights into the reality of putting these lessons to work.
Publishers Have High Expectations
“The management team of El Faro are trying to figure it out without having the benefit of being there [in Los Angeles],” said seminar attendee Ivonne Veciana. “The El Faro team do have some hypotheses, but they want the answer to all of their questions in Google Analytics.”
This reflects a common perception that there are hard-and-fast answers to these important but complicated questions. The answer, it turns out, is rarely a fixed yes or no, but more often “it depends.” The “on what” is what these organizations have to figure out.
Upon returning to El Salvador, Veciana had a hard time meeting the expectations of her management team, which had agreed to her participation and was anxious to learn more about the business.
“The hypotheses of the management team required additional context, including qualitative data, not just metrics,” said Veciana. In the real world, it wasn’t as simple as: identify a business problem, design your questions, form a hypothesis, test and measure it, draw conclusions, act on them, and repeat.
That is not to say that she was unable to provide any insights or value. Veciana has become, in effect, the “data whisperer.” To give her team more context for the data, she has created an index of external site data, particularly from distribution partners, to provide more insight into the reach and impact of their journalism.
Metrics Require a Constant Commitment
Laura Weffer, co-founder and editorial director of Efecto Cocuyo, had a similar experience back in Caracas. “Before, my interpretation of the data was much more intuitive,” said Weffer. “Since being in the Los Angeles training I now have a greater understanding of what to look for and how to get it.”
Another lesson learned is that metrics, measurement, and interpretation require constant commitment. Said Weffer: “I work with deadlines and work best when I have a deadline to manage to.” As a result, she has instituted a weekly report and meeting in which she reviews the prior week’s data and communicates her findings to her staff.
Metrics, it turns out, are not just about gaining a better understanding of what is happening on the site, but also of how the technology is performing. Armed with her new knowledge of how Google Analytics can be used, Weffer uncovered an alarming data point.
“All of a sudden, we were seeing abnormally high bounce rates one day on the website,” she said.
After writing to Professor Dana Chinn for further help, a common behavior among all the participants, it was discovered that the analytics tags were incorrectly deployed on the site, leading to an abnormal and inaccurate reading. “Before, I would not have been able to figure out what was wrong. Now we were able to troubleshoot and fix the problem in a very short period of time.”
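Tag-deployment bugs like this one are often easy to spot once you know to look. One plausible failure mode, assumed here for illustration: if interior pages are missing the tracking snippet, every visit that starts on a tagged landing page records a single pageview and reads as a bounce. The sketch below, with placeholder URLs, spot-checks a handful of pages for missing or duplicated classic Google Analytics tracking IDs; it is my own illustration, not a tool from the course.

```python
# Spot-check pages for missing or duplicated Google Analytics tracking IDs.
# Sessions that start on a tagged page and continue to untagged pages record
# only one pageview -- and therefore read as bounces.
import re

import requests

GA_ID = re.compile(r'UA-\d{4,10}-\d{1,4}')  # classic (Universal) Analytics IDs

# Placeholder URLs -- swap in a sample of your own landing and interior pages.
PAGES = [
    'https://example.org/',
    'https://example.org/a-recent-story',
]

for url in PAGES:
    html = requests.get(url, timeout=10).text
    ids = GA_ID.findall(html)
    if not ids:
        print(f'{url}: no GA tag found; pageviews here go unrecorded')
    elif len(set(ids)) > 1:
        print(f'{url}: multiple tracking IDs present: {sorted(set(ids))}')
    else:
        print(f'{url}: OK ({ids[0]})')
```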
Technical Problems Can Be a Deal-Breaker
Unfortunately, problems with technology platforms, and the struggle to devise and implement fixes, were a limiting factor elsewhere as well. Smita Mitra, desk editor at the long-form publisher Himal Southasian, ran into multiple technical issues that undercut the benefits of deploying a rigorous analytical regime. “Unfortunately, the benefits of the program have been delayed until we can address the larger issues of our underlying technology platform,” said Mitra.
What is apparent from each of these participants’ experiences is that this type of training, and competence in both deploying and analyzing performance data, can be invaluable for these organizations. With it, newsrooms of all sizes can deploy and use these tools to gain a far greater understanding of how their products and services are actually being used.
What is also apparent is that most of these efforts will fall short, or even fail, if newly trained practitioners are not given sufficient ongoing technical and analytical support once they are back in the real world.
Given the high expectations going in, and the limited understanding among management on both the editorial and business sides coming out, it appears that more can and should be done to bring the other members of the leadership team up to speed and invested in the program.
While all three participants suggested having an additional online resource to refer to once back in the “wild,” it appears that relying on one individual within an organization, no matter its size, produces less than optimal results and does not promote cultural change.
This analysis leads me to believe that, instead of flying out a single individual to a training program away from the organization, a more effective approach would be to teach and train publishers where they are.
As previous programs by the Media Impact Project have demonstrated, the ability to go into a publisher and train both business and editorial leadership on metrics integration not only helps set expectations, but also helps solidify a culture of evaluation that can lead to real action within an organization.
By introducing an experimental methodology into a news culture and maintaining it, we can help independent and mainstream publishers alike establish a metrics regime that yields meaningful, actionable answers.
Author’s note: Himal Southasian has since suspended publication.
Kevin Davis is a Senior Fellow at the Media Impact Project and the principal of KLJD Consulting, a strategic revenue and development practice specializing in assisting independent news & information companies and their leadership.