KPBS’ Google Map

When people think of community or hyper-local neighborhood news, they typically think of bake sales, petty crime and development catfights. But when a disaster strikes, the stakes for community news are raised, and lightning-quick news updates online can save lives and help residents cope.

That was the reality in San Diego and Southern California during last week’s series of wildfires, as mainstream radio, TV and newspaper outlets used the web — and social media — like never before. Local public broadcaster KPBS relied on Google to host its amazing annotated map of the area, while sending out updates via micro-blogging service Twitter. The ABC TV affiliate in San Diego, 10News, streamed its TV broadcast online for 38 hours — the first time it had ever streamed entire shows. And the San Diego Union-Tribune’s SignonSanDiego site used blogs to get info out quickly, while getting thousands of reader comments on articles, blog posts and in forums.

These local outlets also ran user-submitted photos and videos, as did national players CNN and MSNBC. However, even the best local coverage of the wildfires sometimes failed to provide the hyper-local level of information that many directly affected by the fire sought.

Rebecca Coates Nee

Rebecca Coates Nee, a former broadcaster who teaches online journalism at San Diego State, had to evacuate her neighborhood for four days and relied on the Internet to find out if her home was still standing. She was disappointed that none of the local San Diego TV, radio and newspaper sites did a good job of answering her most basic questions.

“I think [the local media] lost sight of their mission, which is to inform the residents, especially those of us who evacuated and only had the Internet to get local information,” she told me. “The best thing would have been to have links to the info by neighborhood…I think [the media] did the best they could, but if they just could have taken a step back and asked, ‘Who is your audience? What do they want to know?’ I think they were just putting out updates from the county, but that didn’t apply to me. I just wanted to know what happened to my neighborhood, and find that information really easily.”

One place that did become a clearinghouse for neighborhood-level information — and citizen reports — was SignonSanDiego’s Wildfires 2007 forum, which has had more than 8,700 posts. Nee complained that the forum wasn’t prominently displayed on the newspaper’s website, and said it could have been better organized by neighborhood. Chris Jennewein, vice president of Internet operations for the Union-Tribune, said the forums were praised by users for helping them connect to give each other timely information.

Chris Jennewein

“The forums and blogs were very effective ways to disseminate detailed information, because readers could help each other,” he told me via email. “We did put a forum link as the third entry under the main blog, but navigation is always a challenge and many readers don’t know what a forum is.”

Public Service vs. Self-Service

What is perhaps most striking about the local coverage of the fires was the adaptability of traditional news outlets during a crisis situation. KPBS lost its transmission tower and was able to switch its audio feed to the frequency of another FM station. The websites for both KPBS and CBS8 were radically changed to one simple web page with links and important information for residents. Local radio and TV stations gave citizens live air time to call in their reports.

KPBS probably made the biggest shift to web technology, as the public broadcaster went from having one full-time person on its web staff a year ago to eight staffers during the recent wildfires. Those staffers worked staggered shifts to make sure a crew was on hand 24 hours a day, aggregating information from the county and from locals onto its Google Map of the area.

KPBS’ site received 36 times its normal traffic, which brought down the site and forced the station to get the news out via Twitter feeds and Flickr community photos, and to move its map to Google for hosting.

“The technical difficulties we had forced us to rely on Google, Flickr and Twitter,” said KPBS online managing editor Leng Caloh. “Flickr was an easy way to tap into the community. Normally we try to get as many eyeballs onto our site, but in this case we were trying to get eyeballs off of the site. Because we were public broadcasting, it was only about serving the public, so there was never a question of whether we should keep the map on our site. I said, ‘OK, let’s just put it on Google,’ and that allowed people to search it and it became even bigger than we ever imagined it would.”

People from Google stepped in to help host the KPBS map, adding bandwidth so more people could access the graphically heavy map. While Google’s hit count on the map ran past 1.3 million, Caloh says that number vastly undercounts the real traffic. By putting the map up on Google, anyone could embed the map onto their site, whether they were a casual blogger or a competing news outlet. That had a good — and bad — effect, according to Caloh.

“It’s good because the information got out there even further, and I saw one site that mashed [the map] up with our Twitter feed and the LA Times Twitter feed,” Caloh said. “That was in the spirit we had in putting it out there. But it also meant that there were media outlets that were using the map without crediting where it came from, so there will be fallout from that. There was a news report saying it was a map from Google, even though our name was on there, and there was an instance where someone stripped out our logo [from the map].”

Despite the crediting problem, online collaboration and aggregation played a huge part in helping smaller local outlets tell a complex and fast-moving story. KPBS online content producer Nathan Gibbs told me there were a lot of great ways that people worked together.

Nathan Gibbs

“Our radio reporters do excellent work telling people’s stories and gathering information,” he said via email. “Their reports, along with calls, blog comments, government sites and other news media helped feed Twitter and the Google Map. It was one intense collaboration that wouldn’t have been as successful without every element in play.”

The Union-Tribune’s Jennewein said his newspaper had learned a lot from the 2003 Cedar Fire, and the print staff was better prepared to use interactive elements during this crisis.

“This time there was enormous cooperation from the print newsroom,” he said. “Print reporters were routinely calling in blog items instead of waiting to write a story. Photographers sent photos throughout the day. We enabled user comments everywhere on the site, and probably received tens of thousands…What we didn’t plan for was the enormous spike in traffic — two to three times the Cedar Fire — and had to install five new servers on the fly. We encouraged the community to post comments everywhere on the site and invited them to call in to our Internet radio station, SignOnRadio.com. We had 25,000 listeners on the second day of the fire.”

Building an Ideal System

For wildfire coverage, many local media outlets had to evolve to serve an evacuated audience that could only access their information online. Plus, local TV outlets used content submitted by users more than they ever had before. 10News broke from its usual M.O. by streaming video online, taking viewer-shot photos and starting a Twitter feed. And NBC TV affiliate KNSD used the Veeker video messaging service to power its user submissions, receiving more than 2,000 photos and videos and posting more than 1,600 to its website.

But even though all these local outlets shifted their thinking during the crisis and saw how citizen contributions could give their coverage depth, they could have done even better. What if there were a unified Google Map that could accept data from various government agencies as well as from news outlets that could vet user submissions? That’s something KPBS’ Caloh would like to see in the future.

“It dawned on me that the information is all out there and there is a technical way to put it [together] in a searchable manner,” Caloh said. “That’s where we need to go, and pressure needs to be put on agencies to do that…Why not just let all agencies and individuals update the map and access it? Google told us they believed in what we were doing, so we might be able to get infrastructure support from them in the future. We tried to get the evacuation perimeter information from the county to overlay it, and there was some resistance because they were concerned with liability. But I thought people are already looking at this map for a resource so why not make it as accurate as it could be? There are some institutional shifts that need to happen.”
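The unified map Caloh envisions is, at bottom, a data-aggregation problem: take geocoded status reports from several sources and publish them in one format that a mapping service can overlay. Here is a minimal sketch of that idea in Python, building a KML file (the geographic format Google Maps and Google Earth can display). All of the field names and sample reports below are illustrative assumptions, not an actual KPBS or county data feed.

```python
# Hypothetical sketch of a unified wildfire map feed: merge geocoded status
# reports from multiple sources (a county agency, a newsroom, vetted citizen
# submissions) into one KML document that mapping services can overlay.
# Field names and sample data are illustrative assumptions.
import xml.etree.ElementTree as ET

def reports_to_kml(reports):
    """Build a KML document from a list of report dicts.

    Each report is assumed to carry: name, lon, lat, source, status.
    """
    kml = ET.Element("kml", xmlns="http://www.opengis.net/kml/2.2")
    doc = ET.SubElement(kml, "Document")
    for r in reports:
        pm = ET.SubElement(doc, "Placemark")
        ET.SubElement(pm, "name").text = r["name"]
        # Credit the contributing outlet or agency in the description,
        # which addresses the attribution problem KPBS ran into.
        ET.SubElement(pm, "description").text = (
            f"{r['status']} (source: {r['source']})"
        )
        point = ET.SubElement(pm, "Point")
        # Note: KML coordinates are ordered longitude,latitude.
        ET.SubElement(point, "coordinates").text = f"{r['lon']},{r['lat']}"
    return ET.tostring(kml, encoding="unicode")

# Example: two reports from different sources merged into one map layer.
sample = [
    {"name": "Rancho Bernardo", "lon": -117.07, "lat": 33.02,
     "source": "County OES", "status": "Evacuation order lifted"},
    {"name": "Scripps Ranch", "lon": -117.11, "lat": 32.91,
     "source": "KPBS (vetted caller)", "status": "No structure damage reported"},
]
print(reports_to_kml(sample))
```

Because the output is a single open standard, any agency or newsroom could contribute reports, and any blogger or competing outlet could overlay the same layer, which is the kind of shared infrastructure Caloh describes.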

And perhaps it’s not just about building the best interactive map or the most up-to-date Twitter feed. What many evacuees really want is simple information on the condition of their homes and neighborhoods. Dan Gillmor, director of the Center for Citizen Media, believes media organizations need to ramp up their hyper-local coverage if they want to be relevant in the next major crisis.

Dan Gillmor

“If media organizations had a hyper-local strategy that — most importantly — gave people an easy way to tell each other what is happening in their own neighborhoods, they’d almost automatically be creating a kind of emergency information system,” Gillmor said via email. “What we need to see is a variety of experiments to see which ones work. I suspect it’ll be startups, not news organizations, that figure this out…But this is squarely in the sweet spot for traditional media if they understand their missions.”

As a former local broadcaster herself, Nee didn’t want to be too critical of news organizations during a crisis, but she wished the hyper-local information had been better organized and more accessible online.

“They got the human interest story down cold, that’s what I call formula journalism,” she said. “But for those of us who really wanted useful information from our media, we were having to hunt and peck for it, especially when we were outside the area.”

Perhaps the one medium that fared worst during the crisis coverage was the print newspaper. Nee noted the irrelevance of seeing newspapers in her driveway when she finally returned to her house in San Diego.

“It was sad…to come back Thursday after being gone for four days, and see the ash-ridden newspapers lying in my driveway,” she said. “I thought, ‘This is such a metaphor for where this is headed.’ I didn’t even bother [reading them], and took them directly to my recycling bin. I had already seen what there was to see online, and they never were as useless as they were then.”

What did you think about the online coverage of the Southern California wildfires? What impressed you and what could have been improved? How could hyper-local journalism serve the audience’s needs in a time of crisis? Share your thoughts in the comments below.

[Note: To check out more online resources, mainstream media efforts, social media sites and blog posts related to the fires, see our comprehensive list.]

UPDATE: Jeff Dillon, the editor at SignonSanDiego in charge of the forums, wrote in comments about some of the challenges in organizing the forums and getting that information to the public:

Structuring them by region, community or neighborhood would have been helpful, I agree. I think it would have taken a bigger institutional commitment (either more staffers or more empowered readers) on our part to make sure that structure continued to reflect the changing geography of the fast-moving, fast-multiplying blazes.

A mixed-blessing we had — mixed because I’m sure it was confusing to some readers — was the sheer variety of interaction mechanisms we’d added to SignOnSanDiego in recent years. In addition to the forums, we now have comments on stories (turned off for server-load reasons for the first few days of the fires), comments on photo galleries, comments on our original Movable Type-based newsblog reports of the fires, comments on the Blogspot-based blog set up for us by Google when we had to stop using the overwhelmed MT-based blog, comments on user-generated photos hosted by Vmix.com and comments posted on some of our Drupal-based community sites. Users were making comments, posting questions and offering answers about the fire on all of those systems — as well as sending us e-mails — making it a challenge to get people the information they were looking for.

Perhaps what’s needed is a dedicated online editor who is going through all comments and reader-submitted material to come up with a unified view of the fire status in neighborhoods. In other words, finding the pertinent comments and putting them in perspective — while also making the most important material easy to find for other readers. Traditional media will have to take their role seriously as aggregator and moderator for citizen contributions, or others will step in to take their place in the communities.