
    How Journalists Are Using New Tools for Collaborative Journalism

    by Anika Gupta
    November 10, 2017
    From left to right, Anika Gupta, Amy Zhang, Corey Haines and Michael Reifman at the 2017 Computation and Journalism Symposium (Photo by Seong Jae Min)

    Experiments like WikiTribune, the collaborative news outlet created by Wikipedia founder Jimmy Wales, excite me. I love the idea of professional journalists working alongside members of the audience, sharing skills and knowledge. I love the feedback loop between users and creators, and I have seen the productivity and partnership that can shine through in the spaces where these two groups meet. Collaborative projects – where news organizations and audiences tell stories in partnership – are also a potential way to address misinformation and build trust.

    At the Computation and Journalism Symposium, which took place October 13 and 14 at Northwestern University in Evanston, Ill., I sat down with three co-panelists to talk about the exciting new tools they’re building for collaborative journalism.


    We discussed everything from the New York Times’ new comment moderation software to experiments in more transparent investigation. Below are a few highlights from our conversation. A note on the panelists: Corey Haines is Chief Technology Officer for the engagement startup Hearken, while Amy Zhang experiments with new forms of online discussion in her research as a computer science graduate student at the Massachusetts Institute of Technology. Michael Reifman recently joined the New York Times, where he’s a senior engineer on the community team.


    Reading (and Fixing) the Comments

    Comments have been around for years, and they still matter.

    The New York Times has been experimenting with automatic comment moderation through a partnership with Google Jigsaw, the Alphabet incubator that builds security technology. Until recently, a human moderator had to look at and approve almost every comment that appeared on the Times website. Now, thanks to their new software, certain comments are auto-approved while other, potentially troublesome, comments get deferred for review by human moderators.
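    Under the hood, a system like this is a score-and-threshold pipeline: each incoming comment receives a machine-generated score, and the score determines whether it publishes immediately or lands in a human review queue. Here is a minimal Python sketch of that flow; the endpoint and attribute names follow Jigsaw’s public Perspective API, but the threshold and routing logic are illustrative assumptions, not the Times’ actual system.

        # Minimal sketch of a score-and-threshold moderation flow.
        # The request shape follows Jigsaw's public Perspective API; the
        # threshold and routing logic are assumptions for illustration.
        import requests

        PERSPECTIVE_URL = "https://commentanalyzer.googleapis.com/v1alpha1/comments:analyze"
        API_KEY = "YOUR_API_KEY"  # hypothetical placeholder
        AUTO_APPROVE_BELOW = 0.3  # assumed cutoff, not a published value

        def toxicity_score(text):
            """Ask the Perspective API for a 0-1 toxicity score."""
            body = {
                "comment": {"text": text},
                "requestedAttributes": {"TOXICITY": {}},
            }
            resp = requests.post(PERSPECTIVE_URL, params={"key": API_KEY}, json=body)
            resp.raise_for_status()
            scores = resp.json()["attributeScores"]
            return scores["TOXICITY"]["summaryScore"]["value"]

        def route_comment(text):
            """Auto-approve low-scoring comments; defer the rest to humans."""
            if toxicity_score(text) < AUTO_APPROVE_BELOW:
                return "auto-approved"
            return "deferred for human review"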

    One big challenge with a system like this is that there’s no universal definition of a problematic comment: a phrase or word that’s appropriate in one setting might be totally inappropriate in another. How does the Times’ auto-moderator filter for context, and how transparent is it about its process? Reifman admitted that it’s a bit of a “black box” and said that the Times is proceeding “with caution.”


    But why have comments at all? Corey Haines, whose startup builds engagement tools for public media and small-scale publishers, wondered whether comments – even well-moderated ones – are necessary.

    “If we want to provide a place for community, a threaded stream is a pretty bad way to build community,” he said. “If you’re saying you want comments to be like a forum, then build a forum.”

    But Zhang, who builds experimental tools that reframe online discussions, suggested a few ways that a new generation of comments could break into more collaborative territory.

    “I don’t think there’s a distinction between ‘forums are one thing and comments are another thing’,” she said. Zhang pointed to experiments like “designing how the comments are shown, letting people organize, summarize, annotate, direct comments to some people not others” or even “breaking up the audience” as ways to completely change the commenting experience by offering more control to readers and more value to journalists.

    “There’s a richness of engagement that we’re missing and that can be [achieved] by providing better tools to users and to moderators,” Reifman said. He said the main focus for the Times right now is achieving scale, but that in the future, they’re looking at letting readers search and explore comments.

    Zhang pointed out that if there’s a need to summarize and organize comments, readers might be willing to do it, Wiki-style.

    The Coral Project, a collaboration between the New York Times, the Washington Post, and the Mozilla Foundation, recently rolled out a new tool called “Talk” to its first partner, the Washington Post. According to Coral’s website, Talk is “a streamlined system that can improve how people behave and interact in the comments space, and allow moderators to more easily identify and remove disruptive comments.” In a demo on the Coral website, the community’s guidelines appear at the top. Readers and commenters respond to each other’s comments by clicking a “Respect” button rather than a “Like” one, since research has shown that a “Respect” button encourages readers to engage with diverse points of view.

    Tools like Talk and auto-moderation might help organizations create more valuable discussions, which in theory could create more space and better avenues for richer collaboration.

    Engineering Empathy

    Good technology blocks offensive comments and online harassment. But great technology addresses why harassment exists in the first place. As Zhang explained, thoughtful commenting interfaces might even encourage greater empathy among collaborators.

    In an experiment, Zhang and her MIT lab asked readers to annotate news articles with “moral framing.” The researchers drew on Moral Foundations Theory, which organizes human thinking around five core sets of moral values (watch Jonathan Haidt’s TED talk about the theory).

    The goal of the experiment, Zhang says, was to get readers “to think about the underlying moral values that are being deployed in an argument” while reading a set of articles about immigration. By thinking about others’ core moral values as well as their own, it would hopefully be easier for readers to contextualize moments of disagreement.
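    To make the annotation task concrete, here is a minimal sketch of the kind of record such an experiment might collect; the class and field names are hypothetical, not the MIT lab’s actual tool, though the five foundations themselves come from Moral Foundations Theory.

        # Illustrative data model for a moral-framing annotation; the class
        # and fields are hypothetical, not the MIT lab's actual tool.
        from dataclasses import dataclass

        # The five core foundations described by Moral Foundations Theory.
        FOUNDATIONS = (
            "care/harm",
            "fairness/cheating",
            "loyalty/betrayal",
            "authority/subversion",
            "sanctity/degradation",
        )

        @dataclass
        class MoralFrameAnnotation:
            article_id: str
            passage: str      # span of article text the reader highlighted
            foundation: str   # one of FOUNDATIONS
            annotator_id: str

            def __post_init__(self):
                if self.foundation not in FOUNDATIONS:
                    raise ValueError(f"unknown foundation: {self.foundation}")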

    Larry Birnbaum, a professor of Electrical Engineering and Computer Science at Northwestern, asks a question during one of the panels at CJ 2017. (Photo by Seong Jae Min)

    Zhang says the research team saw an increase in empathy in their test users after a week of framing discussions in this way. (NewsFrames has a fuller explanation of the experiment and its results.)

    From a news organization’s perspective, tools that encourage empathy could help promote more diverse and productive conversations around the news. For readers, such tools open up the process – even after the fact – by emphasizing that journalists, like anyone else, have points of view that can be debated.

    Transparent Reporting

    One of the big problems with comments, according to the founders of Hearken, is that readers only get to engage after a story is published. But what if they could engage from the start, including helping the newsroom decide what stories to cover and how? That’s the premise behind Hearken, or as Haines said: “How do you change the culture to bring the audience into the process?”

    When a radio station in Michigan wanted to do a series on mental health, Hearken helped it build tools to ask its audience what they wanted to know about the topic before the series even began.

    Asking questions can help news organizations target their stories in an era when clickbait articles make it hard to figure out what readers actually value. Metrics like time on site and pageviews are “passive signals,” said Zhang.

    “People are being inundated with information and don’t have the tools to manage all the clickbait that’s coming at them,” she said.

    Hearken is also working on a new tool that will let readers ride along while a story is being reported, offering feedback along the way. Variations of this idea exist at other outfits, as well. The Dutch journalism outlet De Correspondent refers to it as “being open about the new things you’re learning as a journalist” and the end result, they say, is that stories are more accessible.

    The goal behind these tools isn’t to reinvent journalism, but to acknowledge that in an era of distributed information and diverse perspectives, the best defense against misinformation is a partnership between journalists and a media-literate population. When it comes to core values, collaborative journalism takes two-way trust as a given: journalists and audiences have something of value to offer each other. News organizations and media companies are still working on building the right tools for that level of trust, as well as adopting the right mentality, but hopefully they’re getting there.

    Anika Gupta has been a product manager, user researcher and travel writer. Her product work focuses on collaborative journalism. She lives in Washington DC. You can find more info about her at digitalanika.com or @DigitalAnika. 

    Tagged: collaborative journalism, comments section, engagement, google jigsaw, new york times, online harassment, transparency

