Several years back, when “The Social Network” was playing constantly on cable and Eli Pariser’s “The Filter Bubble” was fresh on my mind, I decided to take a highly unscientific poll in my college journalism class about what students knew about news personalization on Facebook.
Using that framing seemed problematic (teachers tend to have a sixth sense about what questions will elicit blank stares), so I decided to start with a broad icebreaker: “Tell me what you know about how the News Feed works.”
Students spent the next few minutes explaining the basic mechanics of Facebook – friending and following; scrolling, scanning and clicking; liking, commenting and sharing. They clearly were familiar with the Facebook interface and their options for interacting with content. But they said little about how their actions affected the composition of their News Feed and whether Facebook prioritized certain posts.
Not only did the terms personalization, algorithm and filter bubble rarely come up in conversation, but comments like “I think Facebook just posts the most recent stories at the top” and “I’m probably just missing my friends’ posts if I don’t see them” indicated that many students had a fundamental misunderstanding of how Facebook’s News Feed algorithm filters and prioritizes news.
The Implications of News Personalization
Facebook is one of many digital media sources that use personalization algorithms to tailor news to users’ tastes. As Pariser explained in his 2011 book, and as journalists and Facebook itself have documented in recent years, the News Feed algorithm tracks a range of user signals – interactions with a friend or page, status updates, likes, shares, etc. – to prioritize hundreds of stories (out of a much larger potential pool) to show each day.
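That scoring-and-ranking process can be made concrete with a toy sketch. The signal names and weights below are my own illustrative assumptions – Facebook’s actual formula is proprietary and far more complex – but the shape is the same: score each candidate post on tracked user signals, then show only the top-ranked slice of a much larger pool.

```python
# Toy sketch of a feed-ranking algorithm. The signals and weights here
# are invented for illustration; they are NOT Facebook's real formula.

def score_post(post, user):
    """Return a relevance score for one candidate post."""
    score = 0.0
    # Affinity: how often the user interacts with this friend or page
    score += 3.0 * user["affinity"].get(post["author"], 0.0)
    # Engagement signals on the post itself (likes, comments, shares)
    score += 1.5 * post["likes"] + 2.0 * post["comments"] + 2.5 * post["shares"]
    # Recency decay: older posts rank lower
    score *= 1.0 / (1.0 + post["age_hours"])
    return score

def build_feed(posts, user, limit=10):
    """Rank the candidate pool and keep only the top posts."""
    return sorted(posts, key=lambda p: score_post(p, user), reverse=True)[:limit]
```

Even in this stripped-down form, the key property is visible: posts from accounts the user rarely engages with decay out of the feed entirely once the candidate pool exceeds the limit, without the user ever being told they were filtered out.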
When I explained this to students, many seemed genuinely surprised. When I showed them how they could minimize personalization by manually selecting “Most Recent” rather than “Top Stories,” or how they could “take control and customize” their News Feed (Facebook’s own words) by editing their preferences to prioritize who to see first or unfollow people to hide their posts, most were unaware of those options.
When I asked them why this all mattered, what the implications are of news personalization and why it’s important to be aware of tools available to modify whether and how news filters are applied, the room fell silent (my sixth sense about avoiding blank stares is imperfect). One student broke the silence by describing a virtue of personalization: helping to lessen information overload, and identifying status updates and news stories of likely interest.
I followed up with the more troubling implications: the emergence of filter bubbles that may lead to consuming a narrow selection of news sources and perspectives, the use of non-transparent factors to select and rank news, and the potential for algorithms to introduce invisible biases. Algorithms are not value neutral, I told students. They are embedded with human judgments – sometimes at the center of high-profile controversies – made by Facebook editors, engineers, curators and users paid to provide feedback on how they use the platform. Tweaks to Facebook’s algorithms can vastly change the composition of users’ News Feed and trending topics section.
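To see why such tweaks matter, consider a hypothetical ranker in which an engineer raises the weight placed on shares: the same two posts trade places in the feed. The signals and weights are invented for illustration, but they show how a single human value judgment, invisible to users, reorders what everyone sees.

```python
# Hypothetical weights (not Facebook's real ones) showing how one
# engineering "tweak" -- raising the weight on shares -- reorders a feed.

def rank(posts, weights):
    """Sort posts by a weighted sum of engagement signals."""
    def score(p):
        return sum(weights[k] * p[k] for k in weights)
    return sorted(posts, key=score, reverse=True)

posts = [
    {"name": "news_article", "likes": 50, "shares": 2},
    {"name": "baby_photo", "likes": 30, "shares": 10},
]

before = rank(posts, {"likes": 1.0, "shares": 1.0})  # likes dominate
after = rank(posts, {"likes": 1.0, "shares": 5.0})   # shares now dominate
```

Nothing about the posts changed; only a weight did. A reader who sees the baby photo first has no way to know that choice was made for them.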
Given Facebook’s popularity among young people and its dominant status as the gateway to finding news online, students need to understand how news has been filtered using personalization algorithms. “These type of algorithms create a news literacy issue because if readers don’t know they are influencing content, they cannot make critical decisions about what they choose to read,” Jihii Jolly argued in Columbia Journalism Review. “In the print world, partisan media was transparent about its biases, and readers could therefore select which bias they preferred. Today, readers don’t necessarily know how algorithms are biased and how nuanced the filters they receive content through really are.”
With anecdotal evidence of students’ lack of awareness of news personalization, and with an assist from a pithy Pariser TED Talk, I began incorporating a lesson into my college classes – and any classes I could visit as a guest lecturer – about how personalization influences our media diet.
If I couldn’t pop students’ filter bubbles, at least I could introduce them to news personalization and the implications of algorithmic filtering.
Still, I wanted more tangible evidence of students’ (lack of) awareness of personalization to bolster my argument for why it deserved attention in college classrooms, and to inform my teaching of the topic. So I decided to conduct a two-part study, recently published online by Digital Journalism.
Interviews Reveal Limited Student Awareness of News Personalization
Part one of the study involved interviews with 37 college students about the ways in which the sources they use to begin a news search apply editorial judgments and track user data to determine news selection and prioritization. Conversations focused on the places where students most often seek out or come across news – portals such as Facebook, Google, Twitter and Reddit, as well as news outlets such as CNN, The New York Times and The Huffington Post that produce at least some of their own content.
Personalization is commonplace on social media platforms, news aggregators, search engines and news outlets that want to show readers more of what they like and less of what they don’t. I asked students what they knew about how news is selected and prioritized on the websites they use as news gateways, and whether their (or other users’) digital media habits affect what is shown or how news is displayed. To avoid priming students, I never used the terms personalization or algorithms.
Here’s what I found: “Interview results show that most students who began news searches on portal sites that use personalization algorithms were aware that user data are collected but could not give specific examples, and were unaware of the role that editorial judgments played in news selection and prioritization. Those who began their news search at a news outlet were largely unaware of whether user data were tracked and what editorial judgments beyond popularity were applied.”
- Students who began their searches on news portals rarely referenced the use of algorithms and most were unaware of how personalization takes place. Among the common responses: “I know there’s an algorithm involved [in Facebook] but I didn’t know what it entails” and “Google collects data from what you search…I’m sure that helps them somehow.”
- Students who started searches at news outlets were mostly unaware of the news selection process, and largely unable to correctly identify whether the outlet they go to for news tracks their digital media habits to personalize news (most incorrectly thought that no personalization occurred).
Survey Finds Students Are Largely Unaware of Personalization on Facebook, Google
Part two of the study used an online survey to examine how aware college students are of news personalization, and specific actions and criteria that affect news selection and prioritization on Facebook and Google, two highly influential news gatekeepers.
These 147 students (not the same ones who previously sat for interviews) answered questions about how Facebook and Google track user data to deliver personalized content. Once again, no questions explicitly referenced personalization, customization or algorithms to avoid priming students.
Here’s what I found: “Results show that students were largely unaware that these influential news portals tailor news to their tastes and almost never referenced the use of algorithms or personalization. They were most aware that their actions and those of their friends affect news selection and prioritization, and least aware of the influence of human value judgments.”
- When asked, “Does Facebook’s News Feed always show you every news item posted by the people or organizations you follow?” few (24 percent) were aware that Facebook prioritizes certain posts and hides others from users’ feeds. Most either believed every post is included (37 percent) or were unsure (39 percent).
- When asked, “If you and someone else separately entered the same search terms for news at the same time on Google, are you both likely to get the same results?” few (25 percent) said the results would likely be different. Most either believed that they would be the same (59 percent) or were unsure (16 percent).
- A majority of students (61 percent) were aware that Facebook lets them adjust their News Feed preferences to affect the posts they see and in what order, but a minority (23 percent) were aware that Google News allows them to customize the news they see and in what order.
- A majority of students were aware that their own actions or usage history on Facebook, such as following, clicking and liking (61 percent), and the actions of friends or organizations they follow, such as commenting and sharing (69 percent), affect news selection and prioritization. Few were aware that the actions of Facebook users they don’t follow, such as aggregate page views and shares (18 percent), or the actions of Facebook engineers, editors or curators, such as tweaking the algorithm and weighing editorial judgments made by news outlets (30 percent), also affect the composition of their feeds.
- Likewise, a majority of students were aware that their actions on Google (61 percent) and on websites other than Google (52 percent) affect news selection and prioritization. Few were aware that actions taken by other users (26 percent) or by Google engineers, editors or curators (31 percent) affect the news results.
Students were often aware of the more visible and intuitive ways that personalization algorithms operate: tracking user behavior and preferences – such as past searches, clicks, likes and shares – to prioritize news of likely appeal. But they were often unaware of the less-visible elements of personalization – namely, actions taken by users outside their social networks and the influence of human judgments in shaping personalization algorithms.
Lessons for the Classroom
As I argue in the study: “It is unrealistic to expect young adults to learn what personalization is, how it works, and why it matters without specific exposure to the topic in school curricula. Given the complexity of the subject matter, it is logical to address news personalization in college, where courses on computer–human interaction and news literacy are often offered.”
One of this study’s main takeaways is that students learn some important lessons about personalization by interacting with content on Facebook and Google. However, as I note, “deductions made are incomplete and often inaccurate.”
This study set out to examine what aspects of news personalization are poorly understood by students and, thus, deserve greater attention in the classroom. Among the lessons I recommend:
- An overview of algorithms
- The implications of news personalization (including filter bubbles)
- The specific types of user data collected (most notably difficult-to-observe factors such as geolocation and aggregate page views)
- The human judgments that go into programming algorithms and evaluating news sources
- Customization options on portal sites such as Facebook (managing account settings or preferences) and Google News (selecting the types of news or sources to include)
There’s still plenty to learn about the best ways to teach these topics, and whether exposure to such lessons affects students’ news consumption habits or perceptions of news sources that use personalization algorithms.
This two-part study was conducted before the 2016 U.S. presidential election, which raised concerns – expressed by Pariser shortly after Election Day – about fake news, misinformation and the filter bubble. Given the current political climate and challenges facing the news media and educators, I’m as motivated as ever to teach students about the implications of news personalization.
Elia Powers, Ph.D., is an assistant professor of journalism and new media at Towson University. He writes regularly about news literacy, audience engagement and non-profit journalism. He received his Ph.D. from the University of Maryland and his undergraduate degree from Northwestern University.