This piece was co-written by Alexa Capeloto.
A couple of days after news broke of Osama bin Laden’s killing in Pakistan, a group of students at John Jay College of Criminal Justice, where we teach journalism, sat in a classroom and talked about how they were first alerted to the story. Most said Facebook. Some said friends or family, primarily via text message. No one named a newspaper. One student, Josh, said CNN.
CNN? So Josh just happened to be watching cable news late on a Sunday night when the bin Laden story broke?
“Oh. No,” he said. “I heard about it on Facebook, then I turned on CNN to find out more.”
In these days of social media, it was surprising that Josh didn’t give Facebook due credit.
After all, the discussion was about the first source, not the best. Did seeing comments on his status feed not count as information delivery in the same way a CNN report did? Was it not real for him until a traditional news outlet confirmed it?
We’re used to our peers and mentors privileging legacy media — be it broadcast or newspapers. But this is not what we expect of today’s college students, a.k.a. tomorrow’s journalists. In their wired world, there are increasingly fuzzy distinctions between professional and citizen, fact and rumor, confirmed and unconfirmed. We see their iPhones and Androids, iPads and laptops, and we figure part of our job as journalism instructors is to call attention to those distinctions. Yet, as Josh’s answer suggests, students might be overcorrecting toward the old school, and in the process psyching themselves out of the journalism game.
Marrying the digital revolution to journalism
We consider this tendency the “digital divide 2.0,” an updated version of the gap that long existed between those who could afford pricey personal computers and dial-up Internet connections and those who could not. Despite the growing affordability of Net-based personal technology, the basic class disparity still exists among our students. Now this new version of the divide adds a psychological dimension that cuts across class lines and might be harder to define, diagnose and fix.
Although our students know how to act the part of digital natives, they’re inclined to see the Internet as a tool for entertainment and socializing, rather than as an information source. Facebook is for photos and “status,” YouTube for cute or crazy clips to pass along to friends, and the rest a treasure trove of music, movies and TV shows (unless, of course, that history paper is due tomorrow and they need to visit Wikipedia).
Despite all the time they spend online, they're behind the curve in understanding the journalistic potential of social media. In fact, some of them are reluctant to recognize the connection between legacy media and Web 2.0, as if, in doing so, they'd be assuming a power best left to professionals.
When we asked our recent crop of digital journalism students to create their own journalistic blogs and market their content through social media, they were uncomfortable. Although they habitually post to Facebook, the thought of actually reporting on a topic and putting their work before the public as journalism, rather than as a personal narrative of candid pictures and random Friday night ephemera, was scary.
In fact, a few students said that they didn’t see blogs as journalism, because anyone could do them. They were in class to learn about reporting and writing — capital-J Journalism — and not to repeat what they already do on their own time.
When one of our colleagues at John Jay published a widely circulated Op-Ed in the New York Times in March suggesting, perhaps polemically, that students be taught to write Twitter feeds and YouTube captions in composition class, our students were more horrified than many of their professors at the thought of bringing those activities into the classroom.
In some regards, it’s refreshing that students already know what we think we’re supposed to teach them. There is a difference between what they post on Facebook and what they see on CNN. Not anyone can do journalism, or at least do it well. It does take time and training and some hard lessons to become responsible, thoughtful purveyors of information.
But no one ever gets to the point of responsible purveyor if they are too scared to test their capabilities as reporters, or too conservative as readers to trust beyond the mainstream media. If students can’t see that there’s journalism lurking in the everyday things they do with information, especially now that technology has made such things constant, instant and ubiquitous, then we truly do have reason to worry about the future of journalism — particularly if the original digital divide is still a factor.
A new digital gap emerges
The digital divide reared its head this semester when one of our strongest journalism students said he wanted to sign up for an online section of Intermediate Reporting but was afraid to because he didn't have Internet access at home. During the summer break, the editor-in-chief of the student newspaper can't access the paper's new website for the same reason.
“If I did have the Internet, what would I use it for?” he said.
If students who know, own and regularly access technology aren't inclined to put it to journalistic use, then what of the students who don't have such access? Not having the Internet at home, or perhaps having parents who don't possess the time or means to demonstrate the web's legitimate capabilities, pushes some students even further back in the march toward careers in journalism.
The digital divide 2.0 is a psychological and sometimes economic divide, but it's also a generational one. When we started college in the early '90s, the library or the campus lab was the prime source of connectivity. As a consequence, we conceived of the Internet as a tool for doing work and getting information, much as we would an old-fashioned terminal-based database or card catalog, or we used it to read primitive newspaper homepages.
When connectivity comes quickly and easily via intuitive mobile devices, and when the web becomes more about entertainment than information, then the associative power of Internet and workspace is undermined. Go to any college library now and count how many screens are on YouTube, Hulu or Facebook for purposes that have nothing to do with news or research.
As for Josh, it’s possible that he overlooked Facebook because it has too much power, not too little. He may not see it as an information source because it’s so ingrained in his world, such an extension of the self, that he doesn’t see it as an external source at all. Like the air around him, it’s so essential that it doesn’t need to be acknowledged.
But how can students properly examine and harness the journalistic potential of digital media if they don’t even see it as media, and how can they become content creators if they don’t believe their content counts?
These are questions we need to consider, beyond teaching nuts-and-bolts journalism, as we prepare our students to be media producers and consumers in the 21st century.
Reporter’s essential tools photo by Valerie on Flickr.
Alexa Capeloto and Devin Harner are assistant professors of English at John Jay College of Criminal Justice/City University of New York, where they direct the journalism program. Alexa earned her master's degree at Columbia's Graduate School of Journalism and spent 10 years as a metro reporter and editor at the Detroit Free Press and the San Diego Union-Tribune before transitioning into academia. Devin has a Ph.D. in English from the University of Delaware and a background in journalism. His recent work includes essays on Chuck Palahniuk's non-fiction; on the film Adaptation's relationship to Susan Orlean's The Orchid Thief; and on virtual time travel through YouTube.