Google search algorithm updates history (1999-2013)

In an effort to regulate the search ecosystem and prevent manipulation by webmasters, Google frequently updates its search algorithm, which is responsible for website rankings and the overall search experience. These changes often disrupt established SEO tactics, making website visibility in search results an increasingly difficult task for webmasters.

What is a Search Algorithm?

In computer science, a search algorithm is an algorithm that identifies items with specific properties within a collection of items. In plain words, in the context of search engines it refers to the (largely secret) method by which a search engine automatically evaluates the relevance of webpages. In the early days of search engines, algorithms were simple and vulnerable to manipulation. Opportunistic website owners figured out how to manipulate search results simply by repeating words, using methods that were later labelled black hat SEO, a term encompassing all the techniques used to create fake relevance and popularity for certain webpages.
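To make the textbook definition concrete, the snippet below is a minimal sketch of a search algorithm, nothing like Google's actual system: it scans a small collection of documents, keeps those containing the query term, and orders them by a naive term-frequency score. The document names and the scoring rule are purely illustrative, and the example also hints at why simply repeating a keyword could fool such a primitive approach.

```python
# Minimal sketch of a search algorithm (illustrative only, not Google's method):
# return the documents that contain the query term, most "relevant" first,
# where relevance is a naive raw term count.

def search(documents, query):
    query = query.lower()
    matches = []
    for doc_id, text in documents.items():
        score = text.lower().split().count(query)  # crude relevance signal
        if score > 0:
            matches.append((score, doc_id))
    # Higher score first; ties broken by document id for a stable ordering.
    return [doc_id for score, doc_id in sorted(matches, key=lambda m: (-m[0], m[1]))]

# Toy "web" of three pages: the keyword-repeating page wins under this naive scheme.
docs = {
    "page-a": "google updates its search algorithm frequently",
    "page-b": "black hat seo tries to game the algorithm algorithm algorithm",
    "page-c": "a page about something entirely unrelated",
}
print(search(docs, "algorithm"))  # ['page-b', 'page-a']
```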

Google algorithm updates per year

The numerous updates Google has applied to its search algorithm since 1999 have always aimed to combat ranking manipulation by webmasters and to deliver results that best match searcher intent. At present, most webmasters struggle to keep up with all the algorithm changes (over 500 per year), while others see their websites penalized for black hat tactics that were used successfully in the past. A brief description of the most important updates and their objectives illustrates the ever-changing ecosystem of search.
 
Screenshot of Google while it was still in beta (1998-1999).


The history and purpose of Google's algorithm changes

From the rather unsuccessful updates for on-page factors (1999) and off-page factors (2000-2003) to the first major update in 2003 (codename: Florida) that "put SEO on the map", Google made it clear that algorithmic updates aim to help the search engine deliver the best possible experience to searchers.


Next came the Everflux update, which introduced daily crawling in order to discover the most relevant and fresh content across the web, the controversial Sandbox update, the Jagger update in 2005 that targeted low-quality links, and the introduction of Universal Search, which integrated most of the result types we see today in SERPs, such as news, images, maps and videos.



In an attempt to eliminate spamdexing and, later, PageRank sculpting, Google introduced the "nofollow" value in 2005, the "rel=canonical" tag in 2009 (a solution for on-page duplicate content issues), the "Panda" update in 2011 and, finally, the "Penguin" update in 2012.

While the "no follow" element was designed for webmasters to facilitate the optimization of their webpages by selecting which hyperlinks the Bots should crawl and index , "Panda" aimed to lower the rankings of low content quality sites.


The use of ranking methods not approved by Google has led to the penalization of websites that implement such techniques. Examples include keyword stuffing, invisible text, content scrapers, cloaking pages and article spinners.
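To give a sense of why blatant keyword stuffing is easy to flag algorithmically, here is a toy heuristic of my own (not anything Google has documented): a simple keyword-density check with an arbitrary 15% threshold already separates a stuffed snippet from natural text.

```python
# Toy keyword-density check (illustrative only): flag a page whose most
# frequent word dominates the text, a crude signal of keyword stuffing.
from collections import Counter

def keyword_density_flag(text, threshold=0.15):
    words = [w.lower() for w in text.split() if w.isalpha()]
    if not words:
        return False
    top_word, count = Counter(words).most_common(1)[0]
    return count / len(words) > threshold  # True means "suspiciously stuffed"

stuffed = "cheap shoes cheap shoes buy cheap shoes cheap shoes online cheap shoes"
natural = "our store sells a wide range of shoes for running hiking and casual wear"
print(keyword_density_flag(stuffed))  # True
print(keyword_density_flag(natural))  # False
```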



The "Penguin" update targeted both content and link spam. Apart from the aforementioned, pages using other black hat techniques such as automated link building software (still a very popular and sophisticated method), hidden links, link farms, sybil attacks, page hijacking, domaining and cookie stuffing, most of them plummeted in SERPs.


With the new 'Hummingbird' update already live, it seems that the search algorithm is shifting towards the semantic web, aiming to provide more intelligent and personalized results to searchers. It remains to be seen what the impact on referral traffic from SERPs to websites will be, given that Google will push its own content further through the personalization of SERPs.

More importantly, the NSA/CIA revelations in 2013 gave Google cover to implement secure connections on all searches, which has in turn led to the loss of vital data for SEO, with "(not provided)" levels reaching 80% in late 2013. According to Pubcon's founder, "not having any data to work with has been the biggest hit SEO has taken since the Florida Update 10 years ago".

All images captured from Webfermento.

Factors such as fresh, high-value content, author rank and natural link building, however, will continue to be essential for ranking in the (transforming) SERPs in the coming years.

Although algorithm updates have caused trouble for most webmasters, it can be argued that these changes have contributed to the development of meritocracy in the search ecosystem. That ecosystem is now in a transformational phase which seems to be devaluing organic results as plain website listings in favour of something the team at Moz describes as the rise of the "Mega-SERP".

March 2013 (updated January 2014), by Vangelis Kovaios

You can also find an insightful presentation about Google's algorithmic "animals" and how they have affected SEO over the last few years.