    Opinary Case Study: How to Ask More Engaging Questions

    by Simon Galperin
    June 19, 2017
    Photo by Jonathan Simcoe used with Creative Commons permission

Opinary builds on a practice as old as time (asking questions) on the theory that people have opinions more often than they have comments. We've learned a thing or two about what makes a good question.

    The definition of good? One that makes the audience (ourselves and you included) want to answer.

The platform we use to ask those questions is Opinary's primary product: a polling widget with an average engagement rate of 18 percent. That's about one in five people leaving an opinion, compared with the roughly one in 100 who leave a comment. Here's an example from PRI:

Last year, two extraordinary Opinary interns, Rosemarie Foulger and Matthew Baughman, examined 1.2 million opinions shared on 923 Opinary polls. Using a variety of statistical methods, notably genetic matching, Rosemarie and Matthew identified causal links between the features of Opinary polls and their rates of engagement. Here's what they learned.
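The core idea behind a matched analysis like this can be illustrated with a toy sketch: pair each poll that has the feature you care about (say, a future value question) with the most similar poll that lacks it, then average the difference in engagement. This is a deliberately simplified stand-in for genetic matching, and every poll, covariate, and number below is invented for illustration:

```python
# Hypothetical sketch of a matched comparison, the intuition behind
# genetic matching: pair each "treated" poll with its most similar
# "control" poll, then compare engagement rates. All data is invented.

def match_and_compare(treated, control):
    """Nearest-neighbor match on covariates, then average the
    treated-minus-control difference in engagement rate."""
    diffs = []
    for t in treated:
        # Find the control poll closest in covariate space
        # (squared Euclidean distance).
        nearest = min(
            control,
            key=lambda c: sum(
                (a - b) ** 2
                for a, b in zip(t["covariates"], c["covariates"])
            ),
        )
        diffs.append(t["engagement"] - nearest["engagement"])
    return sum(diffs) / len(diffs)

# Invented polls; covariates here might be (question word count,
# article length in words).
treated = [
    {"covariates": (12, 800), "engagement": 0.21},
    {"covariates": (9, 500), "engagement": 0.19},
]
control = [
    {"covariates": (11, 820), "engagement": 0.17},
    {"covariates": (10, 480), "engagement": 0.16},
    {"covariates": (20, 2000), "engagement": 0.25},
]

effect = match_and_compare(treated, control)
print(f"Estimated engagement lift: {effect:+.2%}")
```

The point of matching on covariates first is that it compares like with like, so a difference in engagement is more plausibly attributable to the question feature itself rather than, say, the topic or length of the article the poll appeared in.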

    What type of opinions the audience is more likely to share

    Opinary categorized the questions we asked into six types that described the sort of response a person was being asked for. In the chart above, you can see descriptions and examples of the six types: past value, present value, future value, prediction, evaluative, and survey.

We found that future value questions, in which a person is asked to assess a moral value in the future, received an engagement rate 2.8 percent higher than other question types; they outperformed prediction and evaluative questions in particular. Survey questions do better than present value questions. And asking for a past value instead of a future value decreased engagement by 6.1 percent.

    What topics do audiences want to leave opinions on

    Opinary separated questions based on topics: politics, celebrities, culture, social media, sports, and innovation (technology).

First: celebrity questions performed worst across the board, decreasing engagement 9.6 percent below the average. Questions on politics increased engagement 2.7 percent above the average. Social media questions were highly engaging, too. Those are questions like "Is Facebook taking the correct action against clickbait?"

    Sports questions had no significant effect on engagement.

    Stronger opinions are more likely to be shared

Above is a graph showing that the stronger someone's opinion, the more likely they are to share it.

The Original Groupings range represents the strength of opinions as measured by where a person places their avatar along an Opinary poll's opinion scale. The strongest opinions are at zero and the weakest at 50. The example above is from Time Inc.'s Money.

The treatment bars indicate what happens to engagement rates when questions are reworded to elicit stronger opinions. So if you changed a poll eliciting opinions in the 0-10 strength range to one eliciting opinions in the 20-30 range, you would lose 8.4 percent engagement.

    Let’s ask better questions

    My tenure at Opinary has come to a close and this is the last piece of knowledge from that work that I’ll have the pleasure of sharing with the journalism community. The point has been to create space for more constructive interaction between newsrooms and their audiences. I hope that by sharing this case study on engaging questions, Opinary can continue to help journalists connect more deeply with their audience.

    Simon Galperin is U.S. head of growth for Opinary. If you have questions or want to take a closer look at the case study, contact Opinary co-founder Pia Frey at [email protected]
