This might be a bit of a rant.
I’ve long felt CTR studies were not comprehensive enough to tell the complete story of user behavior, but unfortunately that’s all we’ve had to go on. Although I am a huge proponent of market research, I’ve found the shortcomings of CTR studies too glaring to shake. After all, are we really an entire industry convincing people to make considerable investments for 18.2% of search volume? When you think about it that way, it seems like a no-brainer to flip the Paid Search switch when we continue to say PPC gets 20-25% of the clicks. However, over the past few days, all of that changed.
@iamchrisle nope…but we sure have been acting like it was. Ha!
— MyCool King (@iPullRank) January 7, 2014
Keywords (now provided)
Unless you’ve been ignoring most SEO conversations as of late, you know that (not provided) has taken up much of the bandwidth since someone realized we’re fast approaching 100%. Aside from Matt Cutts bragging about how he’s trying to break the spirit of Black Hats, and a few sites getting banned, it’s been a pretty slow SEO news cycle.
Despite that, many people tentatively predicted that Google Webmaster Tools would be the place where that data would ultimately resurface.
They were right.
Go ahead and dust off Google Webmaster Tools. I know you haven’t logged in for a few weeks; as long as you’re on top of your on-page game, it’s really not that actionable. I know. I’ve been hard at work on a new toolset trying to figure out how to leverage the data creatively, and there hasn’t been much in there that I would care about on a daily basis.
But I digress.
In the Search Queries section (it’s under Search Traffic on the left; it’s okay, like I said, I know you never come here), you can now see keywords versus landing pages in the Top pages tab.
Yes, Google gave you back your keywords.
How accurate is it? Well, with Google Analytics sampling becoming more of an issue, it’s hard to tell on the high-volume sites that I have access to, but I’ll grab my site and some data from a bigger site to check against.
The only thing the review of my site reveals is that I need to start posting more, but for argument’s sake, GWT is off by an average of 4% for the URLs I was able to pull.
Now for a bigger site that gets a good amount of traffic: I pulled 50 URLs and compared them, only to find out that GWT is off by an average of ~17%.
My sample sizes here are obviously quite small. This would be worth someone examining in more detail across more sites and far more URLs than the 16 and 50 I’ve looked at here, but despite the cute little note below, you shouldn’t expect the accuracy you’re used to out of this new report.
Roll Your Own CTR Model
You already know my stance on (not provided), so while this added data is really cool, it’s not really what I’m excited about. What I am excited about however is the Top queries section that allows me to export data that looks like this:
That’s right, for the first time ever Google is giving us exact CTR data for each position a given keyword has appeared in. In one fell swoop, Google has rendered the various CTR studies completely useless. From now on we have the ability to develop CTR models based on our own sites rather than models like this:
The problem with existing CTR models is that it’s impossible to collect a sample big enough to support strong conclusions. Every vertical is different. In fact, every SERP is wildly different with the inclusion of knowledge boxes and rich snippets, so it’s pretty reckless to build predictions off of these industry-standard models.
What would be ideal is for an analytics company like Moz or Conductor, or even Experian, which sniffs keyword traffic from ISPs and such, to build a series of far more robust CTR models based on millions of sites segmented by vertical. I’ve even kicked around the idea of building something called the FreeRankingsProject.org that would let everyone track unlimited rankings in return for anonymized access to their analytics data… but, again, I digress.
Our hearts have all been in the right place in building these sample-deficient CTR models, but now we can build our own models from this data.
Unfortunately, Google makes the data pretty difficult to get at scale, but the general premise has you collecting your data from the bottom of the individual Search Queries pages and creating something that looks like this:
That in turn gives you your own CTR curve that you might find is far steeper than the curves you’re used to looking at.
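The aggregation itself is straightforward once you’ve exported the Top queries data. Here’s a minimal sketch of rolling that export up into a per-position CTR curve; the column names (“Impressions”, “Clicks”, “Avg. position”) are assumptions about what the export contains, so rename them to match your file, and note that rounding the average position to the nearest integer is a simplification.

```python
import csv
from collections import defaultdict


def ctr_by_position(queries_csv):
    """Aggregate a Webmaster Tools 'Top queries' export into a CTR curve:
    {rounded average position: total clicks / total impressions}.

    Column names are assumptions; adjust them to match your export.
    """
    impressions = defaultdict(float)
    clicks = defaultdict(float)
    with open(queries_csv, newline="") as f:
        for row in csv.DictReader(f):
            # Bucket each query by its rounded average position
            pos = round(float(row["Avg. position"]))
            impressions[pos] += float(row["Impressions"])
            clicks[pos] += float(row["Clicks"])
    return {pos: clicks[pos] / impressions[pos]
            for pos in sorted(impressions) if impressions[pos] > 0}
```

Plot the resulting dictionary and you have a CTR curve built from your own site’s behavior instead of a borrowed industry average.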
In fact, from the spot-checking I’ve done across various Webmaster Tools profiles while writing this, I’ve noticed that most #1 spots are reported as being far above 18.2% (Slingshot), 36.4% (Optify), etc. However, at the end of the day, you’ve got to take it all with a certain grain of salt. After all, we are talking about Google. ;]
Like I said, this is a rant, or a quick brain dump after checking out a new feature. I figured I’d dust off my blog so I could break this writer’s block!
Hat tip to Cyrus Shepard for tweeting a screenshot that led me to check out these new features.
So what do you think about this new data and the existing CTR studies? How should we handle predicting success: industry-standard CTR studies, or rolling our own?
UPDATE: The folks at Branded3 already developed a method for calculating true CTR from WMT data a few months back. Check it out!