Futures Lab Update #64: Social News via Storyful; Emotion Detection From Eyeris

    by Reuben Stern
    June 18, 2014
    This week we learn how the social media news agency Storyful serves up user-generated content to news organizations, and we find out how the startup Eyeris measures viewers’ emotional reactions to video content.

    "User-generated content will become probably the primary source of content for all news organizations around the world, whether they’re newspapers or TV. The value is in how you take that UGC, contextualize it and turn it into a story. There has never been a greater need for that service." – Mark Little, CEO and Founder, Storyful

    PART 1: Storyful

    Storyful is a social media news agency that scans the social web to deliver verified user-generated content to news organizations and provide editorial support to source, verify and contextualize the content. Founder and Chief Executive Officer Mark Little explains how the company offers a new model of reporting.
    Reporting by Tatiana Darie.


    For more information:

    Storyful recently launched the FB Newswire, a partnership with Facebook that allows both organizations and individuals to access verified content shared on Facebook in real time.

    The company has also partnered with Google to create the Open Newsroom on Google+. It allows people who witness, observe or have knowledge about a subject to come together and analyze stories.


    PART 2: Eyeris

    The startup Eyeris has developed a platform that can recognize and record a viewer’s emotional reactions in real time as the person watches video content. We hear how it works from Eyeris founder and Chief Executive Officer JR Alaoui. We also hear from Paul Bolls, co-director of the PRIME Lab at the University of Missouri, about how the technology might fit with the needs of advertisers.
    Reporting by Colin Hope and Reuben Stern.

    Reuben Stern is the deputy director of the Futures Lab at the Reynolds Journalism Institute and host and co-producer of the weekly Futures Lab video update.

    The Reynolds Journalism Institute’s Futures Lab video update features a roundup of fresh ideas, techniques and developments to help spark innovation and change in newsrooms across all media platforms. Visit the RJI website for the full archive of Futures Lab videos, or download the iPad app to watch the show wherever you go. You can also sign up to receive email notification of each new episode. 

    One response to “Futures Lab Update #64: Social News via Storyful; Emotion Detection From Eyeris”

    1. Paul Bolls says:

      First I want to say that I really appreciate the attention in this story given to facial recognition, a hot and exciting research technology. The company featured in this video is doing some exciting work from a technology perspective. However, this story was edited to make me appear a lot more supportive of this technology than I actually am. Video that did not make it into the story included me discussing how every measure of human response has strengths and weaknesses, and there are SIGNIFICANT weaknesses in facial recognition as a measure of human emotional response.

      The proprietary nature of the supposed science behind technologies that are marketed as a “new” research methodology prevents the transparency that is necessary for valid and extensive scientific inquiry into both the validity and the practical value of the technology. This is especially problematic, in my opinion, when the companies selling the technology as a marketing research or media content optimization tool have no scientific background in Media Psychology (Media Psychophysiology), the emerging discipline focused on advancing scientific understanding of how individuals mentally perceive, interact with, process and respond to media content and technology. Anyone interested in learning more about these disciplines, please reach out to me at [email protected]

      Paul Bolls
      Creative Media Science
      Co-director, PRIME LAB, University of Missouri School of Journalism
