
SEO Insanity : Confusing Correlation With Causation


Something I see all the time from both my fellow webmasters and from search engine optimizers is the problem of confusing correlation with causation. Frankly, it drives me nuts.

Let me explain.

This is also known as the “cum hoc fallacy.” Basically, it's the assumption that because two things happen together, one of them must have caused the other.

So how does this relate to search engine optimization?

Well, part of the problem for SEOs (and their clients) is that the processes and criteria the search engines use to rank pages are pretty much a “black box” to those of us standing on the outside. By that I mean we can try to figure out how they do it, and we may even feel we've got a pretty good handle on at least some of the factors, but we don't really know everything. Not for sure. Not all the details. (And, you know, the devil often lives in those details.)

Add to that the fact that the search engines themselves are changing their algorithms almost on a daily basis. (For instance, refer to this Google 2008 Founders' Letter, in which Sergey Brin says Google made 359 changes to their web search last year.) By the time you've managed to reverse engineer today's algorithm, the search engines could have implemented dozens, or even hundreds, of changes you haven't yet factored in. Algo-chasing is simply a losing game in the end.

So one of the secrets of the SEO industry is that even the best practitioners are, to some extent, feeling their way in the dark. They may declare things to be immutable, incontrovertible facts on their blogs, on their corporate websites, at conferences and seminars... but the truth is, in many cases they've simply deduced, through trial and error over time, how things generally appear to work at the moment (with no guarantees they'll continue to work that way in the future).

Why do you need to understand this?

Because not understanding it can lead you to make bad decisions based on bad information.


I see it all the time on the forums where I hang out. Somebody will make some minor change to their website, and along about the same time they perceive some difference in their search rankings, their traffic or their sales, and they immediately leap to the conclusion that what they did caused the change. Or they'll zero in on one factor they think is “important” in causing (and maintaining) their rankings while disregarding other, possibly more important factors.

Now, they might be right. For example, an outfit called Marketing Experiments regularly conducts tests and has documented many cases where apparently minor changes have led to major differences in clickthrough rates, conversion rates and profits.

But in my experience, more often than not, the website owner or webmaster has jumped to the wrong conclusion.

For instance, there are the webmasters who think that because they added some words to their keyword meta tag, their Google rankings dropped. So they panic and remove the “extra” words, then panic further when their “correction” doesn't fix the problem. (Which it would be highly unlikely to do, considering those words didn't cause the problem in the first place; Google doesn't even index the contents of the keyword meta tag.)

If you or your SEOs make an update to your site and along about the same time you notice a difference in your rankings or traffic or sales or any other factor that matters to you, don't simply assume the update caused the difference. If it turns out you assumed incorrectly, you could find yourself wasting time and/or money making unnecessary changes to your site that won't cause the effects you desire.

First, be sure you've looked at all the other reasonably possible factors. Sure, your automated tool says you increased your keyword density from 3% to 5%, but can you be sure that's what caused your rankings increase? Were there any algorithm updates along about that time? (With 359 in one year, it's almost certain there were — most were probably minor, but can you be sure none of them affected your pages?) Your website doesn't exist in a vacuum, so did any of your competitors make any changes to their pages? Are you subject to any known market trends? Economic conditions? Did you (or any of your competitors) launch any new advertising or marketing campaigns during the same time frame?
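
As a side note, that “keyword density” number is nothing magical; it's just occurrences of the keyword divided by the total words on the page. Here's a rough sketch in Python of how a tool might arrive at a figure like 3% or 5% (the function name and the sample text below are my own illustration, not taken from any particular SEO tool):

```
import re

def keyword_density(text, keyword):
    """Return keyword occurrences as a percentage of total words."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

page_copy = (
    "Blue widgets are our specialty. We stock blue widgets in every size, "
    "and our widgets ship the same day you order."
)

# "widgets" appears 3 times in 21 words, so this prints roughly 14.3%
print(f"{keyword_density(page_copy, 'widgets'):.1f}%")
```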

And second, test your hypothesis. Scientists don't consider an experimental result to be valid unless it can be repeated. If you think your change caused the effect you're seeing, try reverting temporarily to the old version of the page(s) in question and wait a week or two. Does the effect also go away? Lather, rinse, repeat. If you can repeat the action and get the same results, then you might be on to something. Otherwise, there's a good chance the original apparent cause-and-effect was simply coincidence, or due to other as-yet-unidentified factors.
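
If you want to be a little more systematic about the lather-rinse-repeat step, keep a simple log of your metric for each “change applied” and “change reverted” period and check whether the gap shows up consistently across cycles. Here's a minimal sketch of that bookkeeping in Python (the daily figures below are invented purely for illustration):

```
from statistics import mean

# (label, daily visits for that period): made-up numbers for illustration
periods = [
    ("change applied", [410, 395, 402, 388, 420]),
    ("reverted",       [401, 399, 410, 392, 405]),
    ("change applied", [415, 408, 390, 412, 400]),
    ("reverted",       [398, 404, 395, 407, 399]),
]

for label, figures in periods:
    print(f"{label:15s} average: {mean(figures):.1f}")

# If the "change applied" averages aren't consistently different from the
# "reverted" ones across several cycles, the original jump was probably
# coincidence or the work of some other, as-yet-unidentified factor.
```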

Or better yet, use the free Google Website Optimizer tool to test your proposed changes, and their effects on the things that matter (like conversions and sales) before you put them in place.
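
And if you're curious what a tool like that is doing under the hood, the core of a split test is comparing conversion rates between the original page and the variant, then asking whether the gap is bigger than chance alone would explain. Here's a rough sketch in Python using a standard two-proportion z-test (the visitor and conversion counts are made up for illustration):

```
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return (z score, two-sided p-value) for the difference in conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Original page: 40 sales from 2,000 visitors; variant: 62 sales from 2,000.
z, p = two_proportion_z(40, 2000, 62, 2000)
print(f"z = {z:.2f}, p = {p:.3f}")
```

A small p-value suggests the difference probably isn't just noise; a large one means you haven't demonstrated anything yet. The tool spares you the hand calculation, but the underlying logic is the same: don't trust a difference until it's bigger than what random fluctuation could produce.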

Bottom line: to reduce your chances of making bad decisions, try to avoid the traps of making bad assumptions and leaping to bad conclusions.