
    How Blocking Search Engines Can Increase Ad Click-Throughs

    by Joe Boydston
    April 16, 2010

    Search engines, RSS feeds and content aggregators make a reader’s life easier by providing new ways to scan for articles and to discover news. One result of this is that readers may no longer feel the need to regularly visit their local paper’s website in order to stay informed about the goings-on around town.

    Following this logic, publishers work hard to make their content as searchable as possible and accessible outside the newspaper's own website. Conventional wisdom dictates that news websites should be optimized for search engines.

    But what if your content is very specific in nature? Suppose you have a respected brand, and people in your community look to you to provide information that is relevant to them. When newspapers give their readers alternative ways to access their information, they are gambling that the a la carte traffic coming back from search engines will more than make up for the direct traffic they lose.


    The theory goes that the easier your content is to find, the more traffic your site will receive. But a recent experiment by a few newspapers in Northern California suggests there’s value in keeping some content away from search engines and aggregators.

    Papers Block Search Engines and Aggregators

    During the first quarter of 2008, three small newspapers in Northern California with website pay walls edited their robots.txt files to disallow search engines and aggregators from indexing any content on their websites. I am vice president of digital media for the newspapers in question; I run web strategy, sales and operations for dailyrepublic.com, davisenterprise.com, and mtdemocrat.com. We made the change when local advertisers started buying Google AdWords instead of ads on our websites. Realtors, for example, buy keywords such as “Fairfield Real Estate News” and put their ads in front of searches for our content through Google, which is not good for us.
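
    For those unfamiliar with the mechanism, the simplest form of such a blanket block is a two-line robots.txt file at the site root (a minimal sketch; an actual file might also name individual crawlers, such as Googlebot, with their own rules):

        User-agent: *
        Disallow: /

    The first line addresses every compliant crawler, and the second tells them not to fetch any path on the site. Keep in mind that robots.txt is a voluntary convention honored by well-behaved crawlers, not an enforcement mechanism.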

    As a result, management at the papers decided to cut off search engines and aggregators. You can view some of the results here:


    [Figure 1: Charts of the four key traffic metrics studied, showing steady growth at each of the three sites.]

    As the charts above illustrate, website traffic has grown steadily in each of the four key metrics we studied. What was most surprising, however, was the impact this change had on our ad-serving effectiveness. The click-through rate for ads rose from a modest 0.29 percent in 2008 to an average of 2.87 percent today on paid access pages. (You can also view some related data here. It compares paid and free websites of similar size.)
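
    To put those rates in concrete terms (the numbers that follow are illustrative, not pulled from our logs): click-through rate is simply ad clicks divided by ad impressions. At 0.29 percent, an ad served 10,000 times draws roughly 29 clicks; at 2.87 percent, the same 10,000 impressions draw roughly 287, nearly ten times the response from the same inventory.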

    It appears that for these papers, traffic volume alone does not determine click-through rates. What I’m suggesting is a positive correlation between increased reader frequency and the click-through rate. Frequency is key to generating advertising response. Simply put: newspapers that give their readers too many ways to read their content may be inadvertently destroying the advertising effectiveness that sustains their business.

    I am not trying to convince you that every website should block search engines, or that newspapers should all try pay walls. But I implore the news community to consider that it is plausible for a news organization to thrive without search engine traffic.

    It’s a concept that stirs up emotional responses from many in the news industry — but it deserves more logical contemplation.

    Tagged: advertising, click-through rates, newspaper, pay wall, search engines, SEO

    8 responses to “How Blocking Search Engines Can Increase Ad Click-Throughs”

    1. I’m intrigued, but I feel like you are only telling part of the story. The traffic grew, but what type of traffic was it? Are the improved metrics due to successful campaigns in other forms of marketing?

      Did you see an immediate bump in the first month you blocked search engines?

    2. I see the point about the positive effect on advertising. I’ve always said that trying to be all things to everyone was no longer a model we could follow online.

      The statistics are very interesting, especially to someone like me, who convinced his company to tear down its paywall.

      For us, in less than 18 months we have quadrupled our traffic, so things can go the other way as well.

      Of course, I can’t say whether the resulting traffic has provided much more value to local advertisers, even though it has given us much more inventory.

      It’s often hard to tell what the long term effect will be.

      I believe that providing high value for advertisers will pay off in the end, even if delivering lower value to a larger audience can bring in more dollars in the short term.

    3. I’m not sure what you mean by type of traffic, but we did not make any other significant changes to the websites at that time. What I do know is that it is a GOOD kind of traffic: the kind where ads get clicked and advertisers are happy. We had a national fast-food chain ask us to slow down their ad rotation because we were costing them too much on their ad serving.

      We do zero online marketing for these websites; they are typical small community newspapers.

      There was a 7% dip in traffic when we first shut off the aggregators. We still get search traffic from Google, but the keywords are generic, like “fairfield news” or “fairfield newspaper.”

      The Quantcast stats I linked to put this in perspective, I think: unique visitors on our private/paid site are far lower than on the comparable free site, but pageviews per person are higher on the private/paid site. This implies that one or both of the following are true:

      1. Readers are more engaged on the private/paid site.

      2. Unique visitors on free sites are grossly overstated (cookie-based counting tallies the same person more than once as cookies get cleared).

    4. Hi Joe, thanks for sharing your data and your experiences – it certainly gives me pause. (I also appreciate the lack of stridency; folks tend to get downright religious about their views on these issues.)
      To me the most interesting stat by far is the click-through rate increase. That’s a powerful, important metric.
      I only wish you’d left ONE of your three papers alone so you had a kind of ‘control’ for your experiment.
      Two questions:
      One: have you compared your rise in pages, sessions and time with industry-wide rises? News sites overall have been rising on all those metrics for the past three years, reflecting an overall increase in use and adoption of the ’net by the general population.
      Two: regarding the undeniably higher engagement – doesn’t this point to the fact that you’re reaching a higher ‘quality’ of reader, i.e. a reader who is local and reads you not because of a specific article they’re searching for (one look and on to the next thing) but because they come to you to see what news you have?

    5. Ryan Thornburg says:

      Hi,

      This is interesting. I’m curious:
      * What % of your site traffic came from Google before and after the change?

      * Was the impact different at the registration-required Solano paper than at the other two subscription-required papers?

      * Related to that, have you always had the same subscription/registration requirements for your papers as you do now? I’d be curious to hear more about that.

      * Do you sell click-throughs or impressions?

      * I wasn’t sure why frequency would increase click-through rates. Have you been seeing patterns that indicate people are more likely to click on an ad after they’ve seen it two or three times?

      * Why do you say that “unique visitors” would be grossly overstated?

      Thanks again for sharing your insights.

    6. @Bill Dunphy – You make a good point: most newspaper websites experienced traffic growth during the period we studied, so we cannot conclude that blocking aggregators is, by itself, a strategy for growth. But I will argue that blocking aggregators does not necessarily mean the end of your website traffic. Website marketing is a complex and nuanced business, and it deserves careful attention to strategy.

      On your second point, I also agree. Based on the ‘pages per visitor’ metric, we see that the average subscriber to our site is highly engaged. But the reason for the high CTR is more likely that we eliminated the non-local traffic that brings the CTR average down.

      Our website visitors are almost ALL local, and they are more likely to find our local advertising relevant to them. In theory, out-of-town visitors to our website actually drive down the CTR, as they are very unlikely to click on local ads.

    7. @Ryan Thornburg – All three sites are technically identical in terms of paywall, registration and structure.

      Before the change, Google accounted for about 19% of our traffic: 7% came in via specific keyword queries (users looking for specific news items), and the remaining 12% was generic search terms like “fairfield newspaper” or “solano county news.” We still receive this generic traffic from Google; in fact, Google referral traffic is up to 14% at one of the sites, all from generic keywords, with no inbound links to individual stories. Bottom line: an initial loss of 7% of traffic, quickly regained within the first year.

      We still sell advertising based on CPM or monthly commitments for web ads. The primary reason we don’t switch to selling clicks is that most local merchants are not able to quantify the value of individual visitors. (They’re not very good at converting clicks into sales.)

      My assertions about frequency are based on the conventional wisdom in the advertising business that people are more likely to act when they see an advertisement more frequently. This is a key concept in newspaper print advertising sales. Our study was not intended to reveal insights into advertising effectiveness, so the results are speculative on my part. But I do see that ads with an exclusive position have a higher CTR on average. (Leaderboard ads that don’t rotate get more clicks than ads that rotate throughout the site.) To get the best CTR, advertisers should deliver their message consistently to a targeted group of people.

    8. These results make perfect sense. The remaining traffic to your site is primarily subscribers, who obviously already feel your content is valuable and are likely to read multiple stories.

      First, I think that CTR, pages per session, and average time on site are not appropriate metrics to compare. By removing natural search visitors (who are of course more prone to checking out one page and then leaving the site, and less prone to browsing around and clicking ads), you will obviously improve these metrics simply by shrinking the denominators used to calculate them.
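
      A quick sketch of that denominator effect, with made-up numbers chosen only to land near the 0.29 percent baseline:

          # Illustrative only: dropping a low-engagement search segment raises
          # CTR without a single new ad click; only the denominator shrinks.

          def ctr(clicks: int, impressions: int) -> float:
              """Click-through rate as a percentage."""
              return 100.0 * clicks / impressions

          # Hypothetical month of ad impressions, split into two segments.
          local_impressions, local_clicks = 70_000, 280    # engaged local readers
          search_impressions, search_clicks = 30_000, 10   # one-page search visitors

          before = ctr(local_clicks + search_clicks,
                       local_impressions + search_impressions)
          after = ctr(local_clicks, local_impressions)     # search segment blocked

          print(f"CTR before blocking: {before:.2f}%")  # 0.29%
          print(f"CTR after blocking:  {after:.2f}%")   # 0.40%

      In this toy example the denominator effect alone moves CTR from 0.29 to 0.40 percent; how much of the jump to 2.87 percent it explains depends on how ad-averse the removed segment really was.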

      Secondly, I believe the growth in sessions per day or pageviews per day can be attributed to the normal migration of newspaper readers from offline to online. If we looked at numbers for every quarter (not just Q1-to-Q1 comparisons), I think we’d see a clearer steady upward trend.

      In my opinion, ad CTR is an indicator of success with your current traffic. By cutting off certain segments of that traffic, you are manipulating ad CTR with a heavy hand. I don’t think you’re doing much harm by removing the 7% that came from natural search, but the only good you’re doing is in fudging the numbers.
