Mediatwits #114: Will Robots Take Over Simple Tasks for Journalists?

    by Claire Groden
    March 21, 2014
    Ken Schwencke of the Los Angeles Times joined the podcast to discuss his algorithm-generated post about an L.A. earthquake.

    This week, when a 4.4-magnitude earthquake hit L.A., the L.A. Times’ Ken Schwencke was the first to break the story. But the actual author of the post wasn’t Schwencke: it was Quakebot, an algorithm he created about two years ago. The article was no masterpiece, but it clearly conveyed the essentials, such as the quake’s magnitude and epicenter, drawn from the USGS Earthquake Notification Service. Schwencke’s robo-reporter isn’t the only one of its kind, either: just this weekend, the Guardian wrote about its own GUARBOT, programmer Will Franklin’s attempt to build a robot that could produce news pieces with analysis. The article GUARBOT wrote began, “The crime-ridden family of quinoa…” and didn’t get much better from there. What role do computers have in the future of journalism? Will we see more Quakebots, or even a smarter GUARBOT? This week, we’re joined by Ken Schwencke of the Los Angeles Times and Christer Clerwall, assistant professor at Karlstad University, along with Andrew Lih of American University and Elise Hu of NPR. MediaShift’s Mark Glaser hosts.
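    Quakebot’s actual source isn’t shown here, but the basic technique is template-filling over structured feed data. The sketch below is a hypothetical illustration of that approach: the field names mirror the USGS earthquake feed’s format, while the magnitude cutoff, wording and function name are assumptions for the example, not Schwencke’s code.

```python
# Hypothetical sketch of a Quakebot-style template filler.
# "mag" and "place" mirror USGS earthquake feed fields; the 3.0
# threshold and the sentence template are illustrative assumptions.

def quake_report(event):
    """Turn a USGS-style earthquake event dict into a short news blurb."""
    mag = event["mag"]
    place = event["place"]
    # Skip quakes too small to be newsworthy (assumed cutoff).
    if mag < 3.0:
        return None
    return (f"A magnitude {mag:.1f} earthquake was reported {place}, "
            f"according to the U.S. Geological Survey.")

example = {"mag": 4.4, "place": "9km NNE of Westwood, California"}
print(quake_report(example))
```

    The appeal of this design is that every sentence comes from vetted feed data, so the bot can publish within seconds of a USGS alert while a human editor reviews afterward.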



    Listen to the Mediatwits and follow us on SoundCloud!

    Thanks to SoundCloud for providing audio support. Subscribe to the Mediatwits audio version via iTunes.


    Follow @TheMediatwits on Twitter.


    Mark Glaser is executive editor of MediaShift and Idea Lab. He is a longtime freelance writer and editor, who has contributed to magazines such as Entertainment Weekly, Wired and Conde Nast Traveler, and websites such as CNET and the Yale Global Forum. He lives in San Francisco with his wife Renee and son Julian. You can follow him on Twitter @mediatwit.

    Andrew Lih is a new media journalist and associate professor of journalism at the American University School of Communication. He is the author of “The Wikipedia Revolution” (Hyperion 2009, Aurum UK 2009) and is a noted expert on online collaboration and journalism. He is a veteran of AT&T Bell Laboratories and in 1994 created the first online city guide for New York City (www.ny.com). Follow him on Twitter @fuzheado.


    Elise Hu covers technology and culture for NPR’s on-air and online platforms. She joined NPR in 2011 to head up the digital launch of StateImpact, a DuPont award-winning public policy reporting network. Previously, she was a founding journalist at the non-profit digital news startup, The Texas Tribune. While working as a political reporter, she also oversaw the Tribune’s social and multimedia journalism, statewide television partnerships and toyed around with new story forms. Outside of work, she’s an adviser to the John S. and James L. Knight Foundation and serves as a regular panelist for the Knight News Challenge.

    Ken Schwencke has worked as a database producer at The Los Angeles Times since 2009. He develops news applications for latimes.com and conducts analysis for reporting projects. You can follow him on Twitter @schwanksta.

    Christer Clerwall is an assistant professor in Media and Communication Studies at Karlstad University, Sweden. His research focus is on online and digital journalism, and he is currently working on a project about algorithmic, or “robot,” journalism, and a project on transparency and trustworthiness in journalism.


    A recent preliminary study by Professor Christer Clerwall indicates that the quality of robot-produced reports may not be detectably worse than that of reports written by flesh-and-blood journalists. In the study, which compared short sports reports written by L.A. Times reporters with software-generated ones, the robotic stories received high scores for being “descriptive,” “trustworthy” and “objective.” The only statistically significant difference was that the human-written article was more pleasant to read. But this study and Quakebot do not mean every journalist should scramble to learn how to build a writing robot. These reports are short and information-heavy; they don’t attempt the analysis-rich or anecdotal reporting that captures readers’ interest, or Pulitzer Prizes. What are the limits of computer-generated reporting? How can this technology be used as a tool for journalists, rather than as a threat?


    L.A. Times Reporter Talks About His Story Writing Quakebot (Poynter)

    Music Sales Fell in 2013 Even As Streaming Revenue Increased (NY Times)

    The U.S. Ends Control of ICANN, Gives Up Backing of the Free Speech Internet (Businessweek)

    The Army’s Robot Recruiter (On The Media)

    FiveThirtyEight Launches at ESPN (FiveThirtyEight)

    Tribune Company Built A Robot That Reads News (Ad Age)

    Claire Groden is the podcast intern for PBS Mediashift and a senior at Dartmouth College. You can follow Claire on Twitter @ClaireGroden.

    Tagged: guardian ken schwencke los angeles times quakebot robot journalism robots sports journalism

