How Has SEO Changed Over The Last 18 Years?

SEO History

Ever since the internet reached a mainstream audience, there has been an ongoing battle to create and optimize websites for the sake of increasing search engine rankings. However, it has never been a steady road.

Throughout its history, SEO has gone from stuffing white-on-white keywords at the bottom of a page and building vast link farms, to creating high-quality content and adopting a strong link building strategy. As Google controls the vast majority of search traffic, when Google changes its algorithm it affects the entire world of SEO. In this article we will look at how SEO has changed over the last 18 years of its existence.

SEO In The 1990s

Throughout the 1990s several search engine projects were developed, such as Architext, created in 1993 by six Stanford students, Matthew Gray’s World Wide Web Wanderer, and Ask Jeeves in 1997. However, it wasn’t until the early 2000s, with the launch of Google’s Toolbar PageRank (TBPR), that webmasters arguably started paying close attention to measuring the rankings of websites. TBPR allowed webmasters, for the first time, to see what search engines (and specifically Google) perceived to be a good website. You could argue that this is the point at which SEO started, as it gave webmasters a way to measure the success of the optimization techniques on their websites. It was at this point that Google released its white hat guidelines, intended to ensure that the best quality websites appeared in the right places. However, whilst these guidelines existed, they were largely ignored, as Google’s algorithm was fairly easy to cheat in those days.


The first documented update to Google’s algorithm launched in 2002, after which many webmasters suffered heavy casualties. Webmasters complained of ranking within the top 10 results only to drop off the search engine results completely overnight. Prior to this shift, Google would update its algorithm every month with changes here and there, but this was an algorithmic update that would change the entire world of SEO. People complained that relevant sites were being deranked in favour of irrelevant results. This was because the update hit those who had optimized their sites using what were previously considered perfectly acceptable link building techniques. After this change, Google began its mission to create a search algorithm focused on providing the user with the most relevant content possible.

During 2003 Google launched seven documented updates in a major attempt at cracking down on black hat SEO: Boston, Cassandra, Dominic, Esmeralda, Fritz, the Supplemental Index update (September Update), and Florida. Cassandra was the first update to come down hard on hidden text and hidden links, and it also dealt with issues such as large-scale link farming. The most notable update was the final one of the year, “Florida”, which Search Engine Journal has called Google’s first major update, and the update that changed SEO forever. After Florida, the previous spammy SEO tactics, e.g. hidden text, keyword stuffing, and multiple sites under the same brand, were completely eliminated as methods of improving a website’s rankings. This was a time when retailers relied heavily upon affiliates for their traffic, and affiliate-based websites were some of the worst hit, so retailers saw a drastic reduction in website traffic as a result of the update. This triggered a tremendous outcry from companies: some went out of business, and some even threatened to sue.

2005 marked a massive change in the world of SEO with the introduction of the nofollow attribute, with Google joining forces with MSN and Yahoo to crack down on spammy blog comment links. To this day, the vast majority of links found in the comment sections of websites automatically use the “nofollow” attribute. It was also the year in which Google launched Google Analytics, a tool created to give webmasters a better understanding of how their websites were operating, along with detailed traffic statistics. In June 2005 Google launched its personalized search function; whilst this was a far cry from the personalised search Google uses today, it was a drastic improvement on Google’s previous attempt at personalising search results. The Jagger update targeted reciprocal links, link farms and paid links. Other 2005 updates included the introduction of XML sitemaps, which could be submitted through Webmaster Tools, along with the launch of Google Local/Maps, which drastically grew the world of local SEO. A final update, which rolled over into March 2006, was known as “Big Daddy”; it changed the way Google handled URL canonicalization and 301/302 redirects, along with other technical issues.
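For readers unfamiliar with the attribute mentioned above, nofollow is simply a value on a link’s `rel` attribute. A minimal sketch of how a blog platform might render a comment link (the URL and link text are placeholders):

```html
<!-- A comment link carrying the nofollow value: search engines are
     asked not to pass ranking credit through this link. -->
<a href="https://example.com/" rel="nofollow">commenter's site</a>
```

Because the attribute sits on each individual link, blog platforms can apply it automatically to every user-submitted comment, which is why comment spam stopped paying off as a link building tactic.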

Throughout the latter half of the noughties there were relatively small updates from Google that had varying degrees of impact on SEO. It wasn’t until 2009 that we started to see Google implement major changes again. It was the year Bing claimed it would show drastically better and more relevant search results than Google, a plan that never came to fruition. It was also the year the “Caffeine” update was first previewed, an update aptly named for its focus on drastically boosting Google’s crawling and indexing speed. As the year drew to a close in December 2009, Google found a way of implementing real-time search with Twitter feeds, Google News and newly indexed content. Prior to this, SEO and search engines had really only focused on websites designed for the long term; now journalists and the news world needed to focus on optimizing their content and entering the world of SEO.


2011-2012 marked two years in which many SEO agencies went bust practically overnight, and the reason for this was the Panda update. For a while webmasters had realized that .edu links, and links with anchor text such as “van repair” or “car for sale”, would result in a higher SERP ranking. Tactics such as these had been abused so heavily that, after Panda, sites attempting similar practices were deranked. The first major focus for Panda was “thin content”: websites that offered hardly any quality content and were clearly set up for the sole purpose of increasing search rankings. Duplicate content (content that exists somewhere else on the internet) suffered exceptionally heavy repercussions, and low-quality content that offered essentially no useful value to human readers was swiftly deranked. Authority also became an increasingly important ranking factor: Google stated that it would favour websites with links from trustworthy sites over those with links from low-authority blogs. Content farming (creating a website full of low quality content whose sole purpose is to increase the SERP rankings of other websites) was also cracked down on. Further targets included high ad-to-content ratios, low-quality content surrounding affiliate links, websites blocked by users, and content mismatching search queries. Other important developments of the period include the introduction of schema markup and Google+.
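Schema markup, mentioned above, lets a page describe its own content in a machine-readable form that search engines can use for rich results. A minimal sketch using the JSON-LD syntax commonly used today (the headline and author values are illustrative placeholders):

```html
<!-- Hypothetical example: structured data describing an article,
     embedded as a JSON-LD script in the page's head. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "How Has SEO Changed Over The Last 18 Years?",
  "author": { "@type": "Person", "name": "Robert Bailey" }
}
</script>
```

Unlike the earlier microdata syntax, this block lives separately from the visible HTML, so it can be added without restructuring the page’s markup.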

The final massive update from Google was the Penguin update, launched in 2012. This update aimed to take down the more subtle black hat techniques; however, many have argued that it negatively affected websites using what were previously considered white hat techniques. It meant that sites using spammy hyperlinks around good quality content suffered deranking in the SERPs. There have not been many major updates since then, apart from a heavier focus on local search, which started in 2015, along with a stronger focus on mobile-friendly websites. Google continues to update its algorithm and the way the search engine operates, but recently there have been no major changes. We can only speculate what the next great update (if there is one) will be. The likeliest theory I would put forward is a clampdown on high-quality private blog networks (PBNs), which continue to exist and which many have argued produce positive results. These are blog networks owned by SEO professionals that produce high quality content with non-spammy links, and are designed and optimized well, but exist purely for the purpose of boosting search rankings. It’s something Google has said it is against but is currently finding very difficult to detect; 2018 could be the year that changes.

Author Bio:



Robert Bailey is the managing director of a digital marketing agency specialising in SEO based in Cardiff called SEO Article Heroes.


I hope you enjoyed this post from Robert on the “History of SEO”. Feel free to contact us with any further questions!
