Google Webmaster Tools Killed the CTR Study

This might be a bit of a rant.

I’ve long felt CTR studies were not comprehensive enough to tell the complete story of user behavior, but unfortunately that’s all we’ve had to go on. Although I am a huge proponent of market research, I’ve found the shortcomings of CTR studies too much to shake. After all, are we really an entire industry convincing people to make considerable investments for 18.2% of search volume? When you think about it that way, it seems like a no-brainer to flip the Paid Search switch when we continue to say PPC gets 20-25% of the clicks. However, over the past few days all of that changed.

 

Keywords (now provided)

Unless you’ve been ignoring most SEO conversations as of late, you know that (not provided) has taken up much of the bandwidth since someone realized we’re fast approaching 100%. Aside from Matt Cutts bragging about how he’s trying to break the spirit of Black Hats and a few sites getting banned, it’s been a pretty slow SEO news cycle.

Despite that, many people tentatively predicted that Google Webmaster Tools would be the place where that data would ultimately resurface.

They were right.

Go ahead and dust off Google Webmaster Tools. I know you haven’t logged in for a few weeks. As long as you are on top of your on-page game it’s really not that actionable. I know; I’ve been hard at work on a new toolset trying to figure out how to leverage the data creatively, and there hasn’t been much in there that I would care about on a daily basis.

But I digress.

In the Search Queries (it’s under Search Traffic on the left. It’s ok. Like I said I know you never come here) section you can now see keywords versus landing pages in the Top pages tab.

Yes, Google gave you back your keywords.

How accurate is it? Well, with Google Analytics sampling becoming more of an issue it’s hard to tell on the high-volume sites that I have access to, but I’ll grab my site and some data from a bigger site to check against.

The only thing the review of my site reveals is that I need to start posting more, but for argument’s sake, GWT is off by an average of 4% for the URLs I was able to pull.

Now for a bigger site that gets a good amount of traffic. I’ve pulled 50 URLs and compared them, only to find that GWT is off by an average of ~17%.

My sample sizes here are obviously quite small. This would be worth someone examining in more detail across more sites and far more URLs than the 16 and 50 I’ve looked at here, but despite the cute little note below, you shouldn’t expect the same accuracy you’re used to out of this new report.
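
If you want to run the same sanity check on your own site, here’s a minimal sketch of the comparison, assuming you’ve exported the Top pages report from Webmaster Tools and an organic landing page report from Google Analytics as CSVs. The file names and column headers below are placeholders I made up, so swap them for whatever your exports actually contain.

```python
import csv

# Placeholder file names and column headers; adjust them to match your exports.
GWT_FILE = "gwt_top_pages.csv"       # assumed columns: page, clicks
GA_FILE = "ga_organic_landing.csv"   # assumed columns: page, visits


def load_counts(path, url_col, count_col):
    """Read a CSV export into a {url: count} dictionary."""
    counts = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            counts[row[url_col]] = int(row[count_col].replace(",", ""))
    return counts


gwt = load_counts(GWT_FILE, "page", "clicks")
ga = load_counts(GA_FILE, "page", "visits")

# Percent difference between GWT clicks and GA organic visits for every URL
# that shows up in both exports, then the average across those URLs.
diffs = [
    abs(gwt[url] - ga[url]) / ga[url] * 100
    for url in set(gwt) & set(ga)
    if ga[url] > 0
]

if diffs:
    print(f"URLs compared: {len(diffs)}")
    print(f"Average difference vs. GA: {sum(diffs) / len(diffs):.1f}%")
```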

Roll Your Own CTR Model

You already know my stance on (not provided), so while this added data is really cool, it’s not really what I’m excited about. What I am excited about, however, is the Top queries section that allows me to export data that looks like this:

That’s right, for the first time ever Google is giving us exact CTR data per the position that a given keyword has appeared in. In one fell swoop Google has rendered the various CTR studies completely useless. From now on we have the ability to develop CTR models based on our own sites rather than models like this:

The problem with existing CTR models is that it’s impossible to have a sample size big enough to draw a strong conclusion. Every vertical is different. In fact, every SERP is wildly different with the inclusion of knowledge boxes and rich snippets, so it’s pretty reckless to build predictions off of these industry-standard models.

What would be ideal is for an analytics company like Moz or Conductor, or even Experian, who sniffs keyword traffic from ISPs and such, to build a series of far more robust CTR models based on millions of sites segmented by vertical. I’ve even kicked around the idea of building something called FreeRankingsProject.org that would allow everyone to track unlimited rankings in return for anonymized access to their analytics data… but, again, I digress.

All of our hearts have been in the right place in building the sample-deficient CTR models so far, but now we can build our own models from this data.

Unfortunately, Google makes the data pretty difficult to get at scale, but the general premise has you collecting your data from the bottom of the individual Search Queries pages and creating something that looks like this.

That in turn gives you your own CTR curve that you might find is far steeper than the curves you’re used to looking at.
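
If you’d rather script that roll-up than copy it by hand, here’s a rough sketch of the idea, assuming you’ve downloaded the Top queries table as a CSV with Query, Impressions, Clicks, and Avg. position columns (those header names are my assumption; check your own export). It buckets queries by rounded average position and computes CTR per bucket.

```python
import csv
from collections import defaultdict

# Assumed export file and columns: Query, Impressions, Clicks, Avg. position.
# Rename these to whatever headers your Top queries download actually has,
# and note this sketch expects plain integer counts (no "<10"-style values).
QUERIES_FILE = "gwt_top_queries.csv"

impressions = defaultdict(int)
clicks = defaultdict(int)

with open(QUERIES_FILE, newline="") as f:
    for row in csv.DictReader(f):
        # Bucket each query by its rounded average position.
        position = round(float(row["Avg. position"]))
        impressions[position] += int(row["Impressions"].replace(",", ""))
        clicks[position] += int(row["Clicks"].replace(",", ""))

# CTR per position bucket: clicks divided by impressions.
for position in sorted(impressions):
    ctr = clicks[position] / impressions[position] * 100
    print(f"Position {position:>2}: {ctr:5.1f}% CTR "
          f"({clicks[position]} clicks / {impressions[position]} impressions)")
```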

In fact, from the spot checking I’ve done across various Webmaster Tools profiles while writing this, I’ve noticed that most #1 spots are reported as being far above 18.2% (Slingshot), 36.4% (Optify), etc. However, at the end of the day you’ve got to take it all with a certain grain of salt. After all, we are talking about Google. ;]

Like I said, it’s a rant, or a quick brain dump after checking out a new feature. I figured I’d dust off my blog so I could break this writer’s block!

Hat tip to Cyrus Shepard for tweeting a screenshot that led me to check out these new features.

So what do you think about this new data and the existing CTR studies? How should we handle predicting success? Industry-standard CTR studies or roll our own?

UPDATE: The folks at Branded3 already developed a method for calculating true CTR from WMT data a few months back. Check it out!

  1. David

    So with today’s GWT update offering more accurate clicks/impressions numbers, do you think that is easier to do at a client level? I picked a few random days over the last week and it still appears to be around 60-70% of what GA is tracking as “visits”, so maybe it could be easier now, but it would still be at a client/industry level with country-specific insights.

    I use an average of the 3 CTR studies and it seems to work well enough until they are ranking organically for the target terms, and then you can use GWT data to refine it for that individual client. The problem is that the many different versions of the Google UX for each country can skew the numbers, along with the competitive nature of the keyword. Some terms are flooded with universal search and AdWords results and other terms just have the basic page of 10 results.

    The big curveball is conversion rates, as I’ve seen conversion rates change for organic keywords based on ranking positions. I think none of these studies really cover this area in detail, and this is probably more important, as knowing which traffic is great is good to know, but being able to estimate revenue is what will get that buy-in.

    But my final thought is that while industry-specific studies are useful, the most accurate ones will be the internal studies that may never see the light of day….

    • ipullrank

      Yep, it looks like they are giving more interesting CTR data at the client level at least. Not sure about accuracy because we no longer have any type of end-to-end data to judge that against.

      I think your approach is solid and it can be refined with this and GA data to build more sound models for prediction.

      As far as the conversion rates being different per position, I completely agree. I’ve done some research and some dynamic targeting based on this back when we had keywords. Check out: http://www.zazzlemedia.co.uk/blog/dynamic-messaging-based-on-ranking-to-improve-conversion/ Basically, users are sometimes in a different position in the user journey based on when they click on you.

  2. Dan Shure

    Hey Mike – nice post. I have to admit I’m thoroughly confused though; I’ve been seeing this data forever in WMT (if we’re referring to “top pages” and clicking on each page to get keywords per page). Unless you’re talking about the new message I see, “an improvement to our top search queries data was applied retroactively on 12/31/13” – their documentation page is unclear as to what that means.

  3. Jan-Willem Bobbink

    It would be more interesting to see the differences compared to data in Google Analytics, for example. I did a quick comparison with actual Google Analytics data, of course with estimations based on average (not provided) percentages, but based on 304,799,510 impressions and 17,372,911 clicks from 12 domains I get a completely different set of data. Worth investigating in more detail for sure, also compared to search volumes from the Keyword Planner tool. I hope to have some time this weekend to write a decent blog post, because I’ve seen some big differences in datasets, so the question is: how accurate is the data Google is giving us?

  4. Andrew Martineau

    Thanks for including a screenshot from the recent (Oct ’13) Google CTR study. Yes, I completely agree with you that specific organic click-through rates for every site can vary quite a bit. As you said, almost every SERP can be visually different, and a user’s intent varies based on their position in the user journey – all of which can cause changes in user browsing/click behavior.

    The data set for the Catalyst study spans 17,500 unique search queries across 59 different CPG websites. While not even a drop in the bucket compared to the reported 5.9 billion Google searches per day in 2013, we still wanted to publish the study, as the sample size was the largest of its kind as far as CTR studies go. Additionally, we segmented the query types in a few additional ways not previously reported.

    Every brand’s audience is different, so yes, internal studies will likely be more relevant and applicable for building forecasting models and strategies for one’s own site.

    p.s. I am fully on board with your idea for FreeRankingsProject.org. I really like the concept.

  5. Jason Kamara

    Thanks for the heads-up, Mike. I actually do spend a lot of time in WMT but never checked the Top Pages tab. The keyword impressions and clicks provide query data unavailable in Analytics even if it is limited to the past 90 days.

    Do you know of any way, programmatically, to export query data for multiple keywords at once? I’m not sure if it can be done with the WMT API.

  6. Corey

    Nice post Mike.

    There’s a lot of noise in comparing blog CTRs (Google authorship) vs. eCommerce (Google Shopping) vs. local terms (local pack/Google Places vs. brand). I would segment by vertical + site type to get some true number that we might all feel comfortable with.

    I think the studies (Slingshot + Catalyst), including yours, should tell us two things. 1) First position earns nearly 2x the clicks of second. 2) Keywords falling between positions 4-10 that are driving some traffic have a big opportunity.

    C

  7. Scott

    Good post, Mike.

    I was just wondering if the CTR data you’re displaying above is with or without brand terms, as these would skew the data heavily.

    I have used the CTR info from Webmaster Tools a few times in the past and it seemed to give pretty accurate results when you split keywords into page-level groups – this enabled me to look at which pages needed improvements to title tags / meta descriptions and how much impact rich snippets were having.

    Whilst this info will not fully replace the data lost through not provided it does give us additional metrics to concentrate on and improve.

  8. Scott

    One thing I forgot to add: the CTR shown is based on the impressions that a page is receiving in Google and not core search volume per keyword, so this also skews the data.
