A Sneak Peek into the History and Evolution of SEO

Search Engine Optimisation (SEO) refers to the set of practices webmasters use to improve the ranking and visibility of web pages in search engines. It is difficult to pinpoint exactly when the practice came into existence, but most webmasters agree that SEO is a millennial, born in the early 1990s. The term "SEO" itself is said to have officially originated around 1997. Even though SEO is still young, there have been plenty of developments over the past twenty-plus years. Let’s try to trace it back to its roots.

The story of SEO obviously starts with the invention of search engines.

  • Alan Emtage, then a computer science student at McGill University, is credited with developing the first well-documented search engine, “Archie,” in 1990.
  • Veronica and Jughead were two search programs created in the early 1990s.
  • Matthew Gray created the World Wide Web Wanderer, which produced an index called Wandex, in 1993.
  • The W3Catalog was released on September 2, 1993.
  • Aliweb was announced in November 1993 and launched in 1994.
  • JumpStation was created by Jonathon Fletcher in December 1993.
  • Yahoo! Search was introduced in 1995.
  • Magellan, Excite, Infoseek, Inktomi, Northern Light, AltaVista, etc. are some of the search engines that came afterwards.
  • In 1996, Rankdex was developed; it was the first search engine to use a page-ranking algorithm.
  • Larry Page and Sergey Brin started working on “BackRub” in 1996, which eventually became the most popular search engine on the planet: Google.

In the early 90s, search engines worked quite simply; they just returned results that matched the keywords of the search query. This allowed website owners to easily manipulate the search results by loading their pages with numerous keywords. This practice of repeating relevant keywords excessively and unnaturally is called “keyword stuffing.” Although it was unethical, it marked the beginning of SEO.

Before the development of the Rankdex site-scoring algorithm, manipulating on-page content was the only way to improve page visibility. Rankdex introduced the idea of using HTML links as a ranking factor. Because there were no techniques to validate the authority of those links, people stuffed large numbers of spammy links into their web pages to rank better.

The search engines that emerged later wanted to rectify this problem in order to serve their users with authentic results. They started developing ranking algorithms and issuing frequent updates to remove their shortcomings and put an end to unethical SEO practices.

The early 2000s marked the beginning of Google’s emergence as a tech giant. It grew in popularity, and most SEO practices started revolving around it. Google began issuing guidelines for SEO practices. That being the case, we will concentrate on how Google’s algorithm updates shaped SEO.

In November 2003, Google announced its first major algorithm update, Florida. It was a statistical link-analysis algorithm launched to detect spammy links. It affected the rankings of many non-spammy sites too, and those rankings stabilised by February 2004. Even today, link analysis is an important aspect of SEO.

In September 2005, Google started implementing the Jagger update and it lasted till November 2005. Jagger’s goal was to eliminate duplicate content, unnatural link building, hidden text, cloaking, etc. Techniques like removing spammy backlinks, creating unique content, improving website quality, etc. enabled webmasters to recover from the aftereffects of this update.

In December 2005, Google rolled out the Big Daddy update. It was an infrastructure update to improve the quality of the search engine result pages (SERPs). Unlike its other updates, Google released this one gradually, completing it by March 2006. Google also tested it on a couple of data centers and shared their IP addresses with SEO professionals for feedback. Many webmasters believe this was a precursor to Google Webmaster Tools (now Google Search Console).

The Vince update was launched in January 2009. It can also be called the “big brand” algorithm, since it favoured well-established brands, helping them gain higher positions in the SERPs. Initially, people thought this was unfair and that Google was simply helping major brands, but big brands are built over time; they had trust, quality, and relevance, and thus ranked higher.

The Caffeine update was announced on August 10, 2009. Since it was a major change, Google provided early access to SEO professionals and gathered feedback. Caffeine was not an algorithm update; rather, it was a redesign of Google’s web indexing system. It enabled Google to crawl and index web pages more quickly and deliver fresher content to its users. It had little direct impact on page rankings, though some web pages dropped in the rankings because fresh content came to be valued more.

The Panda update rolled out on February 23, 2011. This update was intended to provide high-quality results to users by reducing the rankings of low-quality websites. Many factors, such as shallow content, unauthorised links, and the presence of excessive ads, affected page rankings. The Panda update mostly hit content farms, as they host enormous amounts of thin text content, so it was also called the “farmer update.” Web pages with original, well-drafted, and well-researched content thrived during this update.

Google’s Freshness algorithm update was announced on November 3, 2011. It was designed to provide users with the latest and most up-to-date content. Search queries were grouped into three time-related categories: recent events, regularly recurring events, and frequently updated information. The Freshness algorithm does not apply to evergreen queries.

Google launched the Page Layout algorithm, also known as the Above the Fold algorithm, on January 19, 2012. It targeted sites that showed an excessive number of ads above the fold. As Google announced, it affected less than 1% of web pages. But it was a sign not to let ads overshadow the content and hinder the user experience.

In February 2012, the Venice update was launched. It revolutionised the way local search works. Earlier, local results were displayed as a Google Places feature, but after Venice they were included in the “ten blue links” on the SERP. People were provided with localised content according to their set geographic location or by analysis of their IP addresses. This update also enabled small businesses to rank better for high-volume keywords with proper optimisation. Building landing pages for specific regions and using location-based keywords became common local SEO practices after Venice. However, it also gave rise to corrupt practices like location stuffing and creating doorway pages.

In May 2012, Google launched the Penguin algorithm, also called the “Webspam algorithm update”. As the name indicates, it mainly targeted sites with spammy or unnatural backlinks and keyword-stuffing practices. Google clarified that Penguin was a continuation of its efforts to tackle low-quality content, efforts that had started with the Panda update.

The Exact Match Domain update rolled out in September 2012. Its goal was to eliminate exact-match domains with low-quality content. This update mostly hit non-.com domains with thin content. Webmasters overcame its effects by providing authentic, unique, and relevant content on their sites.

The Google Payday Loan update was launched in 2013. It mainly targeted spammy queries and spammy sites, mostly related to payday loans, casinos, pharmaceuticals, insurance, etc. After this update, more webmasters started to let go of manipulative tactics in order to stay in Google’s good books.

The Hummingbird update came in August 2013. Although it was a rewrite of the core algorithm, it didn’t produce any dramatic ranking shifts. It mainly focused on natural-language queries. Rather than just matching keywords to documents, it enabled the search engine to look for context and intent in longer, conversational queries. After this update, Google stopped looking at a search query word by word and instead started to look for what it really meant.

In July 2014, Google launched the Pigeon update. It changed the way local businesses appeared on the SERPs. Google improved its location and distance ranking parameters to provide ranking to deserving small businesses in the user’s proximity.

In April 2015, Mobilegeddon, Google’s mobile-friendly algorithm, was launched. Prior to this update, Google had announced that mobile-friendliness would become a ranking factor, so the aftereffects were minimal, and the user experience improved.

In May 2015, Google rolled out another update, which webmasters called the Quality Update. It was a nightmare for many SEO professionals, as rankings showed major fluctuations. Google later acknowledged that this update changed how the core algorithm assesses quality.

RankBrain is one of the most important algorithm updates Google has ever introduced. It was announced in October 2015 but had been in operation for a few months before that. It is a machine learning system built to understand the intent of the search query. The Hummingbird update had already started to treat search queries as entities instead of strings. RankBrain applies to queries that are completely new to the search engine. Instead of just matching keywords, it tries to understand, much as a human would, what the user really means. It also measures user satisfaction and adjusts the results accordingly.

Google’s Fred update rolled out in March 2017. It was another update to eliminate pages with thin content and highly obtrusive ads. After Fred, more and more quality updates came. Ultimately, it is the quality of the content that matters.

In March 2019, a broad core update came, which webmasters referred to as Florida 2. In October 2019, the BERT update was announced. The BERT (Bidirectional Encoder Representations from Transformers) algorithm is used for processing natural-language queries. Many other updates addressing problems such as featured-snippet duplication, online harassment, and spam content were launched later. Google also continues to release smaller updates frequently and broad core updates occasionally. With advancing technology and changing search trends, such updates are necessary.

Every year, Google launches thousands of updates, and they are mostly unpredictable. Google also provides only a little information about the process. But all these updates aim to improve the quality of content and the user experience and to reduce black hat practices. Thus, the only way to overcome their effects is to always provide the best to the users. So, webmasters, brace yourselves: only those who deserve it rank high. The world of SEO is constantly subject to change and is ever evolving. In order to survive, we have to keep our eyes open for changes and always deliver the best to the users.