
    How User Feedback Informed My Data Journalism Training Program

    by Kuang Keng Kuek Ser
    April 24, 2015
    DataN, an affordable, customized training program in data journalism for small newsrooms, has found its first user.

    Since January, I’ve been developing DataN, the affordable, customized training program in data journalism for small newsrooms. Now the program has found its first user: Malaysiakini.com, the most visited news website in Malaysia (and also my former employer).

    The DataN training program consists of:

    • Pre-training survey and interviews to understand what the newsroom needs and how it operates
    • Two days of training at Malaysiakini’s office on March 7 and 8
    • Post-training support (currently underway) that provides consultation and collaboration on data projects

    Outcomes

Based on feedback from the 10 participating journalists, as well as the data and graphical components they produced in the first two weeks after the training, I would say the program is off to a positive start.


From our post-training communication, I learned that six of the ten participants produced at least seven data/graphical components in their stories during the first two weeks after the training. The components include charts, maps, and interactive graphics.

Six of the ten participants produced at least seven data/graphical components in their stories.

I’m in the midst of measuring performance, especially time spent on stories that include these components, against average figures to further quantify the training’s impact. But this could be tricky, as Malaysiakini does not analyze the web analytics it collects.
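The comparison I have in mind could be sketched roughly as follows (the data, column names, and figures below are hypothetical illustrations, not Malaysiakini’s actual analytics):

```python
import pandas as pd

# Hypothetical page-level analytics export: one row per story, with the
# average time readers spent on it (seconds) and a flag for whether the
# story carried a data/graphical component. All values are made up.
stories = pd.DataFrame({
    "title": ["A", "B", "C", "D", "E"],
    "avg_time_sec": [95, 210, 120, 260, 180],
    "has_data_component": [False, True, False, True, True],
})

# Compare mean time spent on stories with vs. without data components.
comparison = stories.groupby("has_data_component")["avg_time_sec"].mean()
print(comparison)
```

Even a simple split like this would show whether stories with data components hold readers longer, though a proper analysis would also control for topic, length, and promotion.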


    To my surprise, the participants started to brainstorm on data journalism projects right after the training and came up with seven ideas, several of which are under discussion.

    From a qualitative perspective, the training was well-received. Here are some testimonials from the participants:

“We are often impressed by the informative, interactive and sleek features on international news sites, but it never comes to mind that Malaysian news sites can do the same, simply because that is not the norm here. Hardly anybody is using these methods. The training was eye-opening in that it provided exposure to modern techniques in journalism that can be applied in any kind of newsroom. It also breaks the traditional perspective that producing journalism content is only for journalists. Clearly, modern content requires close collaboration with people having different skill sets, including programming and web design.”

    “Although I felt the two-day course was a little too tight to cram in all things at once, it has been an enlightening lesson using and incorporating current technology and tools to convey our stories. It was an eye-opener for many of us who are still in the conventional and conservative method of reporting through text and images. Worth every minute spent attending the course. Highly commendable to fellow journalists.”

    Another important qualitative measurement of training outcomes would be whether the training has changed the culture of the newsroom to be more proactive and creative in exploring different ways of storytelling. I plan to conduct another survey with the participants and other members of their newsrooms next month to gauge this information.

The preliminary outcomes, both quantitative and qualitative, support my fundamental assumption that DataN is something that small newsrooms like Malaysiakini need.

    The next question is: how can I use the experience with Malaysiakini to improve my products and business plan?

Photo taken from author’s original blog post.

    What does Malaysiakini want and do?

    In design thinking, we are told that what users do (and don’t do) is more important than what they say they want (and don’t want). In the case of Malaysiakini, I found that journalists’ answers and behavior were pretty consistent, and they have a profound impact on my products.

    Before the training, they told me they didn’t want to learn data journalism per se; they wanted their journalists to be more creative in storytelling and realize there are different ways to tell different stories.

    The ultimate goal, of course, is to differentiate their products from competitors’ and to increase their web analytics (page views, time spent, recirculation, etc.). Since Malaysiakini’s business model is essentially based on subscription and advertising, producing engaging and high-quality content is key to the company’s bottom line.

    From my previous experience as a journalist at Malaysiakini from 2005 to 2013 as well as my communication with the participants, I know the journalists’ plates are full with daily breaking news, a norm in many small newsrooms. It has hindered their capacity to explore and experiment with new storytelling tools and approaches.

    In short, “we want to be creative, but we have no time.” So, how does this situation affect my products?

The demand for creative storytelling, not just data journalism, struck me. Hence, I adjusted the training content to include non-data storytelling tools such as TimelineJS, StoryMapJS and the network visualizer KUMU. The post-training feedback showed that I had made the right decision: the participants said the possibility of using these tools in their stories was high. Another tool that received positive feedback in terms of likely use is the mapping tool CartoDB. Data journalism tools like Tabula, Import.io, OpenRefine, and Datawrapper, which I thought were critical, performed slightly less well.

    This insight should be read together with another finding. Compared to data visualization and other theoretical parts, more participants said data scraping (Import.io and Tabula) and cleaning (OpenRefine) were harder to learn. This came as no surprise since their learning curve is steeper.
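To illustrate why cleaning has the steeper learning curve, here is a minimal, hand-rolled sketch (with made-up data) of the kind of inconsistency that OpenRefine’s clustering feature is designed to resolve:

```python
# Made-up example of the cleanup work OpenRefine automates: the same
# name typed inconsistently across rows must be normalized before the
# data can be counted or charted.
raw_rows = [
    "Barisan Nasional",
    "barisan nasional ",
    "BARISAN NASIONAL",
    "Pakatan Rakyat",
    "pakatan  rakyat",
]

def normalize(value):
    # Lowercase, trim, and collapse internal whitespace -- a manual
    # stand-in for OpenRefine's clustering-and-merge step.
    return " ".join(value.lower().split())

cleaned = [normalize(v) for v in raw_rows]
counts = {name: cleaned.count(name) for name in sorted(set(cleaned))}
print(counts)  # {'barisan nasional': 3, 'pakatan rakyat': 2}
```

OpenRefine does this interactively and at scale; the hard part for journalists is not the tool itself but recognizing that “dirty” data needs this step before any visualization is possible.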

    The findings

    User demands and behavior:

    • Journalists want to learn creative storytelling, not data journalism per se.
    • Journalists don’t have time to practice or use more sophisticated tools.
    • After the training, participants started using simple tools to produce quick-hit interactive components.

    Training feedback:

    • Two-day training is too short to learn so many tools (12 in total).
    • Data scraping and cleaning skills are harder to learn.
    • Data scraping and cleaning tools are less likely to be used.

    The above findings may warrant some adjustments to my product offerings:

    • Remove data scraping and cleaning (OpenRefine, Tabula, Import.io) from the training and group them as another training package: an Advanced Data Journalism course. The basic training can still include a brief introduction to these skills.
    • Provide data scraping and cleaning as an outsource service to newsrooms since they don’t need it frequently, and those skills require time and effort to master.
    • Break the two-day training into different flexible training packages – interactive storytelling, data visualization, and data scraping and cleaning.

    However, I would have to work with more newsrooms to find a broader pattern before making any significant changes to my products.

Photo originally posted on author’s blog.

    Technical Lessons

    • Do not assume participants’ Internet skills. Ask them through your survey. A few participants were not familiar with Google Drive, which I used to share files and datasets. Google Drive issues slowed down the training.
    • Participants have different levels of Internet skills. Some took longer to follow the training.
    • Expect the unexpected. Many unusual technical issues cropped up during the training: slow Internet connection, incompatibility with browser or Javascript, and issues with OS when installing tools.
• Participants wanted more time to practice the tools and less time spent on theory and concepts.
• They also wanted a session to brainstorm story ideas and work on real data projects. The three-hour Data Expedition, scheduled as the last session, was canceled because of time constraints.
• Make sure any tool that needs to be installed on the newsroom’s server (Datawrapper) is working properly before training participants to use it.
• Some participants felt overwhelmed because so many tools were introduced – 12 in total.

    Solutions

    • Make sure all participants know how to use Google Drive before the training, perhaps by pairing them up so they can teach each other.
• Allocate more time to each hands-on session. Almost every such session took longer than I expected.
    • Reduce the number of tools and theoretical sessions, and allocate more time to real projects.
    • Pair less tech-savvy participants with fast learners during training.
    • Assign a coordinator to make sure everybody has all necessary tools installed and functioning before training.

    This post originally appeared on the author’s personal blog.

    Kuang Keng Kuek Ser is a Tow-Knight fellow and was a journalist with Malaysiakini, the only independent news website in Malaysia practicing watchdog journalism in a highly-censored media environment. In 2013, he was awarded a Fulbright scholarship to further his study at New York University’s Studio 20, a journalism M.A. program that focuses on new media and digital innovation. Keng has interned with NBC Local Media to produce data and visual components. He was also a data visualization consultant at Foreign Policy.


