Whenever I get a chance, I discourage marketers (or anybody else) from trusting web metrics companies like Quantcast, Alexa or Compete. In my experience, the numbers they report are consistently unreliable. I did a little ad hoc analysis to illustrate this point.
The publishers of the social news site Reddit provide further evidence of this phenomenon. They posted a screenshot from their Google Analytics account, indicating that their site typically receives about 8 million unique visitors a month.
They then compare that figure with estimates from Compete (927,000 visitors), Quantcast (about 5 million) and Nielsen (625,000). Clearly, none of these numbers is particularly useful to somebody trying to gauge the popularity or influence of a particular site.
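Just to put the gap in perspective, here's a quick back-of-the-envelope comparison (a rough sketch in Python, using the rounded figures above and treating Reddit's own Google Analytics count as the baseline):

```python
# Rough comparison of third-party estimates against Reddit's own
# Google Analytics figure (~8 million monthly uniques, as cited above).
baseline = 8_000_000

estimates = {
    "Compete": 927_000,
    "Quantcast": 5_000_000,
    "Nielsen": 625_000,
}

for service, uniques in estimates.items():
    error = (uniques - baseline) / baseline
    print(f"{service}: {uniques:,} uniques ({error:+.0%} vs. Google Analytics)")

# Output:
# Compete: 927,000 uniques (-88% vs. Google Analytics)
# Quantcast: 5,000,000 uniques (-38% vs. Google Analytics)
# Nielsen: 625,000 uniques (-92% vs. Google Analytics)
```

Under-counting a site by anywhere from 38 to 92 percent isn't a rounding error; it's a different answer.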
It’s shameful how frequently the tech and marketing industries trust these sites. It’s a classic illustration of how people prefer highly dubious or inaccurate statistics over none at all.
In our book, we quote SEO guru Vanessa Fox on this topic:
All of the services are fairly notoriously unreliable. They all use different methods for gathering data that make them inaccurate by nature. Alexa, for instance, uses the Alexa Toolbar, which is skewed toward a certain user demographic. These tools are useful in a couple of ways, however: for trending over time and for comparisons. If you use one tool to gather data on these two things, then although the data will be unreliable, it should be equally unreliable over time or among sites, so the trending should be fairly accurate.
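To make Fox's point concrete, here's a minimal sketch (Python, with made-up illustrative numbers) of the "trending, not totals" idea: take monthly estimates for two sites from a single tool, index each series to its first month, and compare the trends rather than the absolute counts, on the assumption that whatever bias the tool has stays roughly constant.

```python
# Hypothetical monthly unique-visitor estimates from ONE panel-based tool.
# The absolute numbers are assumed to be wrong; the hope (per Fox) is that
# they are wrong by a roughly constant factor, so the trend is still usable.
site_a = [40_000, 44_000, 52_000, 61_000]   # illustrative, not real data
site_b = [90_000, 88_000, 87_000, 86_000]

def trend_index(series):
    """Express each month relative to the first month (1.0 = no change)."""
    base = series[0]
    return [round(value / base, 2) for value in series]

print("Site A trend:", trend_index(site_a))  # [1.0, 1.1, 1.3, 1.52] -- growing
print("Site B trend:", trend_index(site_b))  # [1.0, 0.98, 0.97, 0.96] -- flat/declining
```

The absolute figures may still be wildly off, but if the bias is stable, the direction and rough size of the change should survive the comparison.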
What’s the best way to determine a site’s popularity? Ask the site’s owner.
Google is, and almost always has been, king. Until that changes, I will continue to focus on them.
When you rank high in one, it will usually help you in others.
Actually, at least as far as Compete is concerned, I’m finding it’s not even consistently inaccurate – it flails all over the place for the same site over a period of months.
And most of these are even less useful for Canadian sites than for US ones, which is saying something.
Paul
My theory on Compete and Quantcast is that they intentionally under-report traffic for sites that don't run their tracking plugins, as a way of pressuring webmasters into installing them.
Their numbers are just too far off. For example, I track my own traffic meticulously and receive just south of 150k uniques every month. Compete and Quantcast show me with wild fluctuations somewhere between 20k and 80k visitors per month, and they differ vastly from each other.
These numbers are just hilariously wrong. (Google Analytics tracks correctly because I have its code embedded in each of my pages.)
What’s funny is how many meetings I go to where someone trots out Compete and/or Quantcast figures as if they were actionable data. I usually just smirk and let them carry on thinking they know something, when they might as well have made up the numbers…