Google remains the giant of search engines. Accounting for 70% of the search engine market in the USA and a whopping 90% here in the UK, its dominance is hard to dispute. Recently, however, the search giant has come under some tough scrutiny and faces an investigation by the European Commission into whether or not it has penalised competitors in its results.
All of this once again raises the question of whether the algorithm should be made public, and (despite the fact that Google is unlikely ever to consider publicising it) the blogosphere has been alight in recent months with arguments both for and against publication of the complete algorithm.
My opinion? It would be entirely catastrophic for the Google algorithm to be made completely public. Granted, I’m not talking catastrophic on the scale of a natural disaster. But in terms of how users go about finding relevant websites online, it would be pretty dire.
The main benefit of publicising the algorithm would be peace of mind that it’s as fair as it should be. But the downsides are immense – and not just because I would probably be out of a job!
As SEOs, we spend much of our working time trying to meet criteria in an algorithm that we essentially have no real visibility over. Of course, the basic criteria are publicly available. We all know that meta titles and meta descriptions are key and that links to your site from other sites are critical. But we also know that some links can have a negative impact and that loading parts of your site with “too many” keywords can result in a penalty. We know the good practices that Google openly publishes, and we know what constitutes bad ones. But this only scratches the surface of SEO.
Testing, experimenting and tailored strategy planning are absolutely vital. It’s not about “gaming” Google, but about finding ways to give it exactly what it wants; with literally hundreds of “minor” algorithm updates (plus some bigger ones) each year, what it wants keeps changing. Certain strategies lose value over time while others gain it.
So with all the time we spend trying to give it what it wants, why would we not want Google to tell us exactly what the deal is?
Simply because it would make the search engine less valuable. It would give rise to “spammy” websites designed purely to meet the search engine’s every last criterion while adding no real value. It would ultimately mean poorer quality results, and that in turn would mean fewer users. Users would gradually drift off to other search engines where they actually find precisely what they are looking for and get higher quality results.
The algorithm as it stands is by no means perfect. It is open to abuse and does get abused. There are questions over how seriously the search engine sometimes takes complaints about spam sites, and I have personally carried out searches where the top couple of results have been irrelevant and poor quality. But for the most part, it’s pretty incredible. Indexing 100,000 pages per second and delivering 90 billion sets of search results per month, it’s a hard-working search engine to say the least. And the vast majority of those searches do produce relevant results and ultimately point the user in exactly the direction they were looking to go.
It’s a work in progress. Every major update is geared more and more towards cracking down on spam and rewarding high-quality websites and SEO techniques. And despite the fact that we might tear our hair out over some of the seemingly random additions and changes at times, there’s no denying that it’s a pretty slick setup!