Increasingly, Public Media 2.0 projects are moving not only beyond broadcast to social and mobile platforms, but into the realms of digital and media literacy training. Producers of such projects recognize that in order to participate fully in the new media world, children and adults need to be able to access, analyze, evaluate and communicate messages in a wide variety of forms.
While there has been some controversy over semantics, for the purposes of this series, we used the term “digital and media literacy,” which encompasses the foundations of traditional media literacy while emphasizing the importance of access to and informed use of digital tools. These types of programs help people to create their own media messages, participate in cross-platform civic dialogue, recognize and evaluate the messages implicit in media, assess the credibility of news and information sources, and understand the risks and responsibilities associated with social media and media production.
Strong, national support for digital and media literacy initiatives is currently lacking — both in the public broadcasting and educational sectors. However, innovative programs are popping up across the country, sometimes in unexpected locations.
Snapshots from the Field
Our series examined initiatives from diverse sources, including public broadcasting stations, non-profit organizations, museums, schools and federal agencies, all designed to help users become fully engaged media consumers and producers. Each of the initiatives had a different focus (building students’ journalism skills, recognizing hidden advertisements, bringing public media to underserved communities, etc.). They took place in person and online, in school and community-based settings, and in both kid- and adult-focused arenas. Five of the most interesting projects included:
- The PBS NewsHour Student Reporting Labs: This program, which recently completed a successful pilot year, pairs high schools with public media professionals in order to create investigative video reports. The program combines digital and media literacy, media production, news and current events, and journalism education, and includes a flexible curriculum developed by Temple University’s Media Education Lab.
- Admongo: Last year, the Federal Trade Commission launched Admongo, an online gaming initiative aimed at helping 8- to 12-year-olds “become more discerning consumers of information.” The centerpiece of the Admongo campaign is a single-player online game in which users navigate everyday settings, searching for hidden advertisements. The project includes an accompanying curriculum, developed by Scholastic. While Admongo provides a fun new way to look at advertising in the classroom, it is lacking in meaningful engagement, as it doesn’t encourage students to critique or analyze advertisements so much as recognize them.
- United States Holocaust Memorial Museum: The United States Holocaust Memorial Museum launched an initiative designed to help teachers use its State of Deception: The Power of Nazi Propaganda exhibit to teach digital and media literacy skills. During January and February of this year, more than 300 English teachers from across the country participated in a digital and media literacy online workshop that introduced the exhibit and key media literacy concepts through a combination of webinars, lesson plans and online discussions.
- Common Sense Media (http://www.commonsensemedia.org): Common Sense Media recently released a new K-12 curriculum focused on digital citizenship. According to the Common Sense Media website, this curriculum aims to “teach students to be responsible, respectful, and safe digital citizens.” The curriculum focuses primarily on digital ethics and responsibilities, using engaging classroom activities to tackle issues like privacy, cyber-bullying, online identities, and copyright/fair use.
- City Voices, City Visions: City Voices, City Visions, a program from the Graduate School of Education at the University at Buffalo, provides summer professional development institutes for middle and high school teachers. These sessions educate teachers on how to incorporate digital video into their classrooms in both interdisciplinary and subject-specific settings. Teachers use handheld digital video cameras and basic editing software to turn academic concepts into familiar video formats and work with the City Voices, City Visions team to create appropriate classroom assignments, evaluation rubrics, and sample videos.
At the Center for Social Media, we are using our examinations of how these projects are assessing themselves to inform the evaluation of a project the Center has been incubating: the Public Media Corps (PMC), a public media and community engagement initiative from the National Black Programming Consortium. A service corps model, the PMC aims to increase both broadband adoption and public media creation/use in underserved communities. Last year, 15 fellows worked with Washington, DC, community organizations and public media stations to create a series of engagement models, which combined media production, media access and civic engagement. CSM will be releasing a report on the results in May.
Evaluating Media Literacy Projects
As with public media engagement projects, digital and media literacy initiatives face a challenge when it comes to evaluating success. There are currently no standard tools for assessing baseline digital and media literacy skills — although in her white paper, Digital and Media Literacy: A Plan of Action, Dr. Renee Hobbs strongly advocates for their development. She notes that “there are so many dimensions of media and digital literacy that it will take many years to develop truly comprehensive measures that support the needs of students, educators, policymakers and other stakeholders.”
Because the initiatives we looked at varied so much in scope and size, each took a slightly different approach toward evaluating programmatic success. Not every organization we profiled implemented a comprehensive evaluation plan. However, many of them did, and some key themes emerged:
1. Set clear and ambitious goals, and assess against them: It is important that digital and media literacy initiatives move beyond “raising awareness” and toward empowering users to make their own meaningful choices, critiques and content. For example, Admongo does not go far enough in allowing users to evaluate and analyze the game’s advertisements, nor does it offer users much in the way of content creation. Successful digital and media literacy initiatives must set goals beyond awareness-raising and evaluate their success against clearly defined criteria.
2. Evaluate both media literacy and media production quality: One of the major tensions in evaluating youth and community media production initiatives is the extent to which media production values should be considered. Leah Clapman, director of the PBS NewsHour Student Reporting Labs, noted that, as the program progressed, program leaders moved away from evaluating the production values of student projects and towards measuring what students have learned in the process. City Voices, City Visions is able to negotiate this tension with a multi-pronged evaluation strategy. Students are judged in class primarily by how well they convey academic concepts through video, but an annual film festival showcases high quality student productions, as determined by external judges.
3. Evaluate both teachers and students: Staffers from almost every initiative we talked to expressed that feedback from both teachers and students is necessary in order to obtain a comprehensive understanding of how well a given project worked. Both Common Sense Media and City Voices, City Visions, for example, combined student assessments, teacher interviews and case studies. Dr. Suzanne Miller, director of City Voices, City Visions, stressed that evaluating teachers beyond the confines of teacher training institutes is key, as “not enough research follows teachers out of professional development institutes and into the classroom.”
4. Examine a variety of data: While most of the data collected in these projects was qualitative (a potential problem for some funders), it took many forms, including case studies, teacher and student interviews, and student pre- and post-assessments. Some of the data was collected through less traditional methods: the teachers involved in the PBS Student Reporting Labs spent a day in Washington, DC, to discuss and debate the program and analyze its strengths and weaknesses with external evaluators. Most of the programs hired external evaluators for at least part of the analysis, which helped to ensure depth of analysis as well as objectivity.
5. Share evaluation data with the field: Many of the programs are planning on publishing evaluation data in order to inform best practices. Common Sense Media plans on sharing video case studies on its blog. The Public Media Corps published a toolkit outlining the lessons learned from the program’s pilot year. This toolkit is designed for use by public media stations looking to implement similar programs but can also be employed as a general guide for community-based media programs. The Center for Social Media will also be working with PMC leaders to release a more comprehensive evaluation next month.
It is this last point — sharing information — that may be the most crucial for measuring the success of digital and media literacy initiatives. Developing shared best (and worst!) practices and lessons learned through smaller-scale media literacy programs will help to ensure the development of the field and the success of future programs.