
isham research

Why Now?

Phil Payne is a Google Bionic Poster and Top Contributor based in Sheffield, UK. The opinions here are not Google's.

Very often, when Top Contributors on the Google Webmasters Help Forum suggest a possible cause for a website's poor ranking, the response is something like:

"But it can't be that - it's been like that for seven years."

Oh yes, it can. Although no one outside Google (and few within) knows exactly how Google's systems work, some things are just so sensible that Google has to do something similar.

There are at least 100,000,000 indexable domains on the web. How many pages there are is a matter of debate - but it's in the tens of billions. The vast majority of these pages will never appear in search engine results, and fewer still will ever be read by a human. So there is no point in Google checking every page in the index for breaches of the Google Quality Guidelines - it would be a huge waste of resources.

The obvious thing to do is to look at the pages that play an active role on the web. And this can be done as background, asynchronous processing - Google always keeps a copy of each page in its "cache", so the check can be run at any time without involving the site at all. There are those who believe Google uses cycles freed up when search volumes are low - hence the many ranking changes after Thanksgiving or Christmas.
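To make the idea concrete, here is a minimal Python sketch of what such a background review loop might look like. It is pure guesswork - every name in it (review_queue, load_is_low, fetch_cached_copy, run_quality_checks) is invented for the illustration and says nothing about Google's real systems.

    import queue
    import time

    # Purely illustrative: a background worker that re-checks stored copies of
    # pages against the quality guidelines. Every name here is invented for the
    # sketch and says nothing about how Google actually does it.

    review_queue = queue.Queue()

    def load_is_low() -> bool:
        """Stand-in for 'search volumes are low' - e.g. a holiday lull."""
        return True  # placeholder

    def fetch_cached_copy(url: str) -> str:
        """Return the stored copy of the page; the live site is never touched."""
        return "<html>...</html>"  # placeholder

    def run_quality_checks(html: str) -> list:
        """Return a list of guideline issues found in the stored copy."""
        return []  # placeholder

    def background_review_worker() -> None:
        # Drain whatever has been queued, but only when there is spare capacity.
        while not review_queue.empty():
            if not load_is_low():
                time.sleep(60)  # wait for a quiet period
                continue
            url = review_queue.get()
            issues = run_quality_checks(fetch_cached_copy(url))
            if issues:
                print(f"{url}: flagged for manual action - {issues}")
            review_queue.task_done()

    review_queue.put("http://example.com/some-page")
    background_review_worker()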

So how does a page get marked - queued - for such processing? The same way the police spot drunk drivers - behaviour that is out of the ordinary. Perhaps having more than a certain number of inbound links. Or suddenly getting even a few new ones after getting none for years. Who knows?
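Purely as illustration, a trigger of that kind need be nothing more than a few comparisons. The thresholds and the link_history structure below are invented - they only show the shape of the idea.

    # Purely illustrative: flag a page for review when its inbound links behave
    # "out of the ordinary". The thresholds and data layout are invented.

    def should_queue_for_review(link_history: list,
                                absolute_threshold: int = 1000,
                                spike_factor: float = 5.0) -> bool:
        """link_history holds inbound-link counts per period, oldest first."""
        if not link_history:
            return False
        latest = link_history[-1]
        if latest >= absolute_threshold:            # more than a certain number
            return True
        previous_peak = max(link_history[:-1], default=0)
        if previous_peak == 0 and latest > 0:       # new links after years of none
            return True
        return latest > spike_factor * previous_peak  # a sudden jump

    # A page dormant for years suddenly gains a handful of links.
    print(should_queue_for_review([0, 0, 0, 0, 6]))  # True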

Google has said on many occasions that its "algorithm" changes up to four hundred times a year. Any one of these changes could move the boundaries of what is and is not acceptable.

And there are always competitors. Getting a given page close to the top of the search engine results is quite easy - but getting it right at the top is harder. One way to struggle up those last few places is to check out the competition - and file a spam report against any site doing anything naughty.

There's another aspect. What happens if a page or a site has two problems that go undiscovered for years? Reconsideration, via the Webmaster Tools interface, is a manual process. So even if one problem is correctly identified and fixed, a reconsideration request may still be refused because the second problem remains. "Black hat" SEO, when discovered, can be very expensive - in the worst case requiring a complete redesign.
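Put as a sketch, the point is simply that a penalty stays in place until every outstanding issue is resolved - fixing one of two is not enough. The Violation type and the example issues below are, again, invented for illustration.

    # Purely illustrative: a penalty is only lifted when *every* outstanding
    # issue is resolved, so fixing one of two problems is not enough.

    from dataclasses import dataclass

    @dataclass
    class Violation:
        description: str
        fixed: bool

    def penalty_can_be_lifted(violations: list) -> bool:
        return all(v.fixed for v in violations)

    site_issues = [
        Violation("paid links passing PageRank", fixed=True),   # found and fixed
        Violation("hidden keyword-stuffed text", fixed=False),  # never spotted
    ]
    print(penalty_can_be_lifted(site_issues))  # False - the second problem blocks it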

And all of the above, of course, is pure hypothesis. But it fits the facts. And it really is possible for breaches of the Google Quality Guidelines to go unnoticed and unpunished for years.

Contact by email or use mobile/SMS 07833 654800