
    The Problem with Web Measurement, Part 1

    by Mark Glaser
    July 25, 2007


    On April 19, 2007, the new CEO of the Interactive Advertising Bureau (IAB), Randall Rothenberg, sent a scathing open letter to the heads of the major web measurement firms, comScore and Nielsen//NetRatings, complaining that they better get their act together:

    Imagine my surprise when I came to the IAB and discovered that the main audience measurement companies are still relying on panels — a media-measurement technique invented for the radio industry exactly seven decades ago — to quantify the Internet…To persist in using panels that undercount or ignore the diverse populations that are the future of consumer marketing is to deny marketers the insights they need to build their businesses. And it certainly appears to us as if they are being undercounted or disregarded, for our members’ server logs continue to diverge starkly from your companies’ sample-based assessments, by [two times to three times] magnitudes in some cases — far beyond any legitimate margin of sampling error.

    Rothenberg called on the two measurement firms to submit to full audits of their methodologies, while various web publishers, media companies and advertising agencies cheered him on from the sidelines. Their rallying cry: It’s about time. The web has been around for more than 10 years as a medium, and it’s been called the most measurable medium in history. Yet, web publishers at large media sites such as CNN.com or NYTimes.com have been dismayed to see that their direct counts of web visitors measured from their servers vary so wildly from the counts of panel-based firms.


    Why does that matter so much? The promise (and problems) of web measurement have been around since the web’s inception, but only now are large advertisers and marketers truly betting the farm on the medium, and moving millions of dollars away from traditional media such as TV and newspapers. Measurement numbers are used by website publishers to sell ads, by ad agencies to satisfy clients, and by third-party measurement firms who package those numbers into high-priced reports.

    Complicating matters is that there are actually two cottage industries around web measurement: the panel approach led by comScore and Nielsen//NetRatings, and the “census” approach used by web analytics firms such as Omniture and Visual Sciences that take numbers directly from a website’s servers. Not only are the panel and census numbers inconsistent, but even numbers within each sector come out cock-eyed and varied. So one site might get a different set of “unique visitors” from each firm doing the counting.

    [Photo: Eric Peterson]

    “It’s ironic that we have all this data, but having the data and knowing what to do with it are completely different,” said Eric Peterson, a consultant and author of Web Analytics Demystified. “Across those two industries [panel and census firms], there are no standards. No standard way to measure visitor interaction, engagement. Page views reported through the Nielsen service are almost always going to be different than page views measured through WebTrends. They ostensibly describe the same thing, like you or I going to visit the home page at PBS.org. But there’s no adhered-to standards for what that data means, and there’s a lot of data. It ends up being more confusing than not for most people.”

    It’s especially vexing for marketers who only have to deal with one ratings system for TV (from Nielsen) and one ratings system for radio (from Arbitron). The Internet has engendered a multiplicity of ways to measure activity, mainly because people do so many things online, from reading text blogs to watching Flash-based video to embedding widgets on social networking pages. Each activity seems to birth another metric, with each firm coming up with their own patented way of counting it.

    “The Internet is not just media — you also read blogs, you read email, you buy, date online,” said Mainak Mazumdar, vice president of product marketing and measurement science for Nielsen//NetRatings. “Media is there but it’s only a small part of it. There’s a wide range of activities going on. Taking it and saying it’s just like any other medium is limiting ourselves. We have to look at it with an open mind.”

    [Photo: Mainak Mazumdar]

    To make matters worse, new obstacles crop up every day that keep measurement firms from getting accurate counts. While Nielsen//NetRatings has a panel of more than 2 million people worldwide with special monitoring software installed on their computers, it’s not a perfect system for extrapolating to the entire universe of web users, who can visit billions of possible websites. Smaller sites along the Long Tail are rarely counted by panel-based firms, despite the growing importance of niche sites catering to people with specific interests.
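    To see why Long Tail sites slip through a panel’s fingers, consider this toy extrapolation in Python (the panel and universe sizes are round numbers invented for illustration, not Nielsen’s actual figures): a niche site visited by a tiny fraction of all users lands on only a handful of panelists, so the scaled-up estimate swings wildly from sample to sample.

```python
import random

def panel_estimate(panel_size, universe_size, site_monthly_reach):
    """Toy panel extrapolation: observe how many randomly recruited panelists
    visited the site, then scale that count up by universe/panel.
    site_monthly_reach is the true fraction of all users who visit the site."""
    panelists_who_visited = sum(random.random() < site_monthly_reach
                                for _ in range(panel_size))
    scale = universe_size / panel_size
    return panelists_who_visited * scale

universe = 180_000_000   # assumed number of web users, for illustration only
panel = 30_000           # assumed panel slice, for illustration only
true_visitors = 90_000   # a niche site reaching 0.05% of users each month
for trial in range(3):
    est = panel_estimate(panel, universe, true_visitors / universe)
    print(f"trial {trial + 1}: estimated {est:,.0f} unique visitors")
# With only ~15 expected panelists, estimates jump around by tens of thousands,
# and a slightly smaller site can show up as zero.
```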

    And the census side of the equation has problems as well, even though the numbers come straight from the computer servers that build pages for web visitors. One of the biggest problems is that many census systems rely on “cookies,” small identifying files stored in the browser that people often delete because of privacy concerns. A recent study by comScore (which echoed a previous study by Nielsen) showed that cookie deletion led to server-based counts that overstated unique visitors by as much as 150%. Plus, census counts must deal with automated search spiders, spyware traffic, and the dual threat of multiple people using one computer or one person using multiple computers.
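    To see how deleted cookies inflate the count, here is a minimal sketch in Python (a toy simulation, not comScore’s or any firm’s actual methodology): every time a person clears their cookie, the next visit arrives with a fresh identifier and gets tallied as a brand-new unique visitor.

```python
import random
import uuid

def simulate_cookie_counting(people, visits_per_person, deletion_rate):
    """Toy model: a cookie-based census count sees only cookies, not people.
    Each time a person clears their cookie between visits, the next visit
    arrives with a fresh identifier and is tallied as a new unique visitor."""
    cookies_seen = set()
    for _ in range(people):
        cookie = uuid.uuid4().hex            # first visit sets a cookie
        for _ in range(visits_per_person):
            cookies_seen.add(cookie)
            if random.random() < deletion_rate:
                cookie = uuid.uuid4().hex    # cookie deleted; next visit looks new
    return len(cookies_seen)

actual_people = 1000
measured = simulate_cookie_counting(actual_people, visits_per_person=10, deletion_rate=0.15)
print(f"{actual_people} real people measured as roughly {measured} 'unique visitors'")
```

    Even a modest deletion rate compounds over repeat visits, which is how the overstatement can climb toward the triple-digit figures in the comScore study.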

    Can a ‘Beautiful Fantasy’ Come True?

    The IAB, which represents media companies and online marketers, has tried to create standard ways of measuring web activity. Before Rothenberg sent his open letter, the IAB had made an effort to get the various measurement firms to submit to audits of their methods, to no avail. While Nielsen//NetRatings did a pre-audit, it was dragging its feet on a full audit — until Rothenberg unleashed his open letter heard ‘round the Net. Not long after, the panel-based measurement firms decided to submit to full audits from the Media Rating Council (MRC), a non-profit established in the 1960s at the urging of Congress to help the TV and radio industries certify their ratings.

    The open letter articulated in eloquent terms what so many publishers and advertisers had been swearing about for years: Why should they pay so much money for research data that seemed so flawed? Because advertisers have a history of trusting third-party panel data in other media, they also rely on panel numbers online instead of the census data from publishers. That means publishers have to accept the panel data as a means of comparison in the marketplace — even if they don’t trust the accuracy of the numbers.

    [Photo: Sheryl Draizen]

    “I think for the first time [the open letter] articulated the pain that people were feeling,” said Sheryl Draizen, senior vice president at the IAB. “It really was an issue that should be addressed and something needed to happen here. It was an aggressive move on our part. We had approached comScore and Nielsen a number of times and hadn’t got a response. Our board was unanimous about doing the open letter because they felt that we weren’t being heard. It had always been a peripheral issue, and this articulated it in a way that had never been done. It is a big issue, and we have a lot of attention on us now.”

    The IAB and MRC have high hopes that they can audit the processes of both census-based and panel-based measurement firms, as well as the way publishers themselves count web traffic, and help bridge inconsistencies with standard ways of counting.

    “A friend of mine described it as the most beautiful fantasy…but it would never happen,” consultant Peterson said. “Omniture has a $1 billion market cap, and I don’t see Omniture tearing apart their technology to calculate unique visitors and page views differently because all their competitors have decided there’s a different way to do it. It’s hard to imagine. Not impossible. Fantasies sometimes come true.”

    The MRC is confident that it can set standards for web measurement in a fair way, and that those standards will be accepted by the industry. MRC executive director George Ivie ticked off all the web standards being developed by his group, which is also shepherding Nielsen and comScore through the auditing process. He said they were working on a standard for clickthroughs to help put a solid number on click fraud, a problem with pay-per-click ads in which automated programs or groups of people run up an ad’s cost with multiple clicks. The MRC is also working on standardizing the measurement of unique visitors, video viewers and even AJAX applications such as Google Maps that don’t create a trail of page views.
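    To give a concrete sense of what putting a number on click fraud involves at the simplest level, here is a purely illustrative sketch in Python (my own naive example, not the MRC’s draft standard): throw out repeat clicks on the same ad from the same visitor within a short window and bill only what remains.

```python
from datetime import datetime, timedelta

def count_billable_clicks(clicks, window_minutes=30):
    """Illustrative only: discard repeat clicks on the same ad from the same
    visitor within a short window. Real invalid-click detection is far more
    elaborate (and proprietary); this just shows the basic de-duplication idea."""
    window = timedelta(minutes=window_minutes)
    last_counted = {}   # (visitor_id, ad_id) -> time of last counted click
    billable = 0
    for click in sorted(clicks, key=lambda c: c["time"]):
        key = (click["visitor_id"], click["ad_id"])
        if key not in last_counted or click["time"] - last_counted[key] >= window:
            billable += 1
            last_counted[key] = click["time"]
    return billable

clicks = [
    {"visitor_id": "v1", "ad_id": "ad42", "time": datetime(2007, 7, 25, 12, 0)},
    {"visitor_id": "v1", "ad_id": "ad42", "time": datetime(2007, 7, 25, 12, 1)},  # repeat click
    {"visitor_id": "v2", "ad_id": "ad42", "time": datetime(2007, 7, 25, 12, 5)},
]
print(count_billable_clicks(clicks))  # 2 billable clicks, not 3
```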

    Though the MRC can’t compel companies to submit to audits and certification, or to follow the standards it sets, the group has had success before: new guidelines in 2004 changed the way ad impressions were counted.

    “When standards were set a lot of people had to change their methodology,” Ivie said. “They went from server-side counting to client-side counting. People had been counting ads when they sent them from the ad server to your browser, not when you received them on your browser. When we changed the standard in 2004 to count at the client side, that forced a change in counting…That greatly enhanced accuracy and forced everyone to change, and everyone did.”
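    Here is a rough sketch of the distinction Ivie describes, with the mechanics simplified for illustration (the beacon idea below is an assumption on my part, not the text of the MRC guideline): a server-side count increments the moment an ad is sent, while a client-side count waits for the browser to confirm the ad actually arrived.

```python
class ImpressionCounter:
    """Toy contrast between counting at the ad server versus at the client.
    The 'beacon' (for example, a tracking-pixel request fired once the ad
    renders) is a simplified assumption, not the MRC guideline itself."""

    def __init__(self):
        self.server_side = 0   # incremented when the ad is sent
        self.client_side = 0   # incremented when the browser confirms delivery

    def serve_ad(self, ad_reaches_browser):
        self.server_side += 1        # pre-2004 practice: count at send time
        if ad_reaches_browser:
            self.client_side += 1    # 2004 standard: count at the client

counter = ImpressionCounter()
for delivered in [True, True, False, True, False]:   # two ads never render
    counter.serve_ad(delivered)
print(counter.server_side, counter.client_side)      # 5 vs. 3 impressions
```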

    The MRC has taken up an ambitious three-phased “reconciliation project” to try to eliminate inconsistencies in how ad campaigns are gauged by various third-party firms such as DoubleClick and by publishers like Yahoo. The scientific study ranges from serving ads onto private web pages all the way to tracking past ad campaigns to account for human error. Ivie said the MRC would test 340 different types of ads and how they are counted by various third-party ad servers as well as publishers.

    Eventually, the MRC would like to audit the methods of all interested third-party measurement firms and larger web publishers. But Peterson, for one, believes that getting industry-wide participation might remain a fantasy. When I told him about the MRC’s plans, this was his stark response via email:

    Interesting news from the MRC, but the fantasy is not knowing what the problem is (via audit); the fantasy is having the vendors actually do something about the differences…My 4-year-old is pretty sure she saw Santa Claus last year too.

    Keeping Up with Shifting Technology

    Another issue for the MRC and IAB — as well as the measurement firms and publishers — is trying to measure a medium that is constantly changing with new technology. At one point, a site’s page views were the gold currency for the number of ad impressions it could serve up. But now a typical web page on, say, OrlandoSentinel.com has banner ads, floating rich media ads that bounce around the page, video ads that play before news video, and contextual text ads served by Google. Each one has a different measurement methodology, rendering simple counts such as page views irrelevant. Page views have also drawn the industry’s scorn because news sites get more of them by cutting up stories into multiple parts and making people click incessantly to get what they want.

    Nielsen//NetRatings recently caused a stir by announcing a new metric, “time spent,” that would be more helpful in gauging dynamic video and social media sites. (See the company’s release in this PDF document.) Many news outlets interpreted the announcement as the death knell for page views, but many analytics bloggers howled that “time spent” had its own inherent problems: sites might now deliberately slow their page loads to boost time spent. As with page views, sites could end up being rewarded for poor navigation or a bad user experience.

    Nielsen//NetRatings’ Mazumdar told me that there was a misunderstanding about page views and that the media had overreacted to Nielsen’s press release.

    “We said page views have been very important to us,” he said. “People misread our release and thought we said page views were being replaced but we never said that. We still report page views just like we have for 10 years or so, but we’re adding another metric called time spent. If you start delivering your blog in an AJAX environment you would lose page views but then you might use time spent and find that people spent 30 minutes on each visit. With TV programs online, that advertising is by time, 15-second or 30-second commercials, and no one knows how that advertising model will work, but we know it will be time-based. It was a shift, not from perspective, but to incorporate new technologies that the content is being delivered to the audience.”

    But measuring time spent could be difficult because someone might visit a website, leave the window open, and go to sleep for the night, crediting the site with hours of time spent while the visitor was asleep. Mazumdar says that Nielsen solves that problem with its patented monitoring software, which resides in the computer’s operating system.

    “We have a rule called ‘in focus,’” he said. “If you have multiple windows open, you only have one that’s in focus. When other windows go out of focus, our software picks that up. So if I’m reading your blog at 11:30 pm and I stop after 25 minutes and leave and go to sleep, we assign 30 minutes as a session, and after that we check to see if you are in focus or the window is asleep and cut that off. Our time spent will be significantly less than Visual Sciences or Omniture or even comScore because they don’t have the same software to gauge the time spent in the same way…We can pick up mouse movements and typing speed, and we monitor the whole computer to be more precise.”
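    Here is a minimal sketch of the idea behind an ‘in focus’ rule (the activity-ping format and the way the idle cutoff is applied are my assumptions based on Mazumdar’s description, not Nielsen’s patented software): only time between activity signals in a focused window is credited, and long idle gaps are thrown out.

```python
def focused_time_spent(activity_minutes, idle_cutoff=30):
    """Credit the time between consecutive in-focus activity pings (mouse
    moves, keystrokes); if the gap between two pings exceeds idle_cutoff
    minutes, treat it as a new session and credit nothing for the gap.
    The ping format and cutoff handling are illustrative assumptions,
    not Nielsen//NetRatings' actual implementation.
    activity_minutes: sorted minute offsets of activity while in focus."""
    total = 0
    for prev, curr in zip(activity_minutes, activity_minutes[1:]):
        gap = curr - prev
        if gap <= idle_cutoff:
            total += gap   # active, in-focus time
        # otherwise: visitor idle or asleep; the gap is not credited
    return total

# Reader opens the blog at 11:30 pm, is active for 25 minutes, falls asleep
# with the window open, and touches the mouse again 8 hours later.
pings = [0, 5, 10, 18, 25, 505]
print(focused_time_spent(pings))  # 25 minutes credited, not all night
```

    The design choice matches what Mazumdar describes: an open but unattended window stops accruing time once the cutoff kicks in.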

    But even as one firm tries to figure out how to measure engagement on web pages, new technologies will continue to proliferate — especially on social media and Web 2.0 sites that employ embedded widgets with multimedia. comScore announced it was beginning to rank web widgets, while startup Quantcast — which combines census and panel data — said it would gauge traffic for widgets including Flash-based video.

    So how can the MRC and its slow-moving standards-setting process keep up with the pace of innovation? Ivie says the group can only move so fast, and can’t possibly cope with every single new technology that puts a crimp in web measurement. Instead, the MRC uses a kind of psychological warfare with firms it is auditing, using the threat of oversight to make sure those measurement firms stay transparent with their customers.

    “The reality is that we don’t audit every day, we audit periodically,” Ivie said. “There are changes that go on with measurement services every day and we choose when to go in to audit. The best thing about the audit is that it instills in their mindset that they have to inform their customers about what they do. So even if I’m not there today, if they know I’m going to come back in two months to complete my audit, and they know I’ll report what they did, there’s a benefit to that in the marketplace…I don’t think we can possibly keep up with all the innovation happening on the Internet. We did an ad impression standard in 2004, and have already had to add to that a rich media standard, a broadband standard and an AJAX standard as add-ons. We didn’t anticipate it but we write standards as soon as we can after these things come out. We do the best we can.”

    The IAB’s Draizen said that her group would try to issue interim guidelines to move things along a bit more quickly, while still being comprehensive in its standards-setting.

    Time will tell whether these efforts to audit measurement firms and set measurement standards will be enough to get entrenched players to change their ways. Plus, the complexity of web measurement and the pace of new technology make it nearly impossible for a one-size-fits-all measurement to become the currency of online ratings. The best hope for people in the online marketing arena is to try to understand why the numbers differ across the various firms and methodologies, rather than expect a quick fix.

    Note: Next Wednesday, MediaShift will continue this special two-part series on web measurement by hearing from website publishers and advertisers to find out which numbers they trust. I’ll also gauge the chances of a unifying web measurement standard, and look at upstart company Quantcast, which combines panel and direct server traffic counts in one metric. If you have thoughts on this subject, please use the comments below or the Feedback Form to submit them, and I will try to run a compendium of feedback in a future blog post.

    Tagged: business, traffic, web measurement, websites

    4 responses to “The Problem with Web Measurement, Part 1”

    1. This is a great analysis. I followed up our rant with a post about what advertisers should be measuring. If I can ever be of additional help please feel free to contact me.

    2. My understanding is that Omniture does not, in fact, use server logs. My experience has been that it uses javascript to count actual page views. Thus, it doesn’t count bots, spiders, etc., that might otherwise show up in server logs (yes, I realize there are other ways to filter them out, too). One issue is if a user has javascript turned off, but my experience has been that this is less than 1% of users. I have found Omniture to be extremely accurate, even if it isn’t always easy to drill down into some of the stats I’d like to see.

    3. It seems very strange that panels would be considered useful at all for Internet studies — haven’t they read “The Long Tail”? Don’t they know how people use the Internet?

      Maybe if you wanted to know about a very specific demographic, such as “men over 60 who have survived x type of cancer,” and you wanted to know how that exact group uses health information on the Web.

      THEN a panel would make sense. But otherwise? What are they thinking?

      Good call-out, Mark.

    4. sachen says:

      Thanks for this good information. I agree that panels aren’t the way to go, not to get general information. They are also expensive! They are interesting to get some nuggets of information, but again not to get a good understanding across the board.
