
Outdated SEO Practices: More Harm Than Good


Today, we all know and understand the importance of SEO. We recognise that it plays a crucial role in a business’s digital visibility. But what we may not know is that not all SEO techniques are alike.

Did you know that some outdated SEO practices do more harm than good? How? Search algorithms are constantly changing, which means a technique that worked months ago may no longer be effective. When a search algorithm changes, so do the factors that determine 1st-page rankings. In this blog, we will explore these factors, when they became obsolete, and how your site could be impacted if they are left unchecked.

How SEO Ranking Factors Become Outdated

Since 1998, Google and other search engines (like AltaVista, Ask Jeeves, Excite, Infoseek, Lycos, and Yahoo) have been competing to organise the world’s information and make it universally accessible in a way that delivers the best possible results. To do this, algorithms were introduced with the sole purpose of unearthing and ranking quality content based on user-specific search data.

These algorithms are constantly changing. Moz has identified that Google rolls out approximately 500–600 improvements every year. With the number and frequency of changes drastically increasing in recent years, once-popular areas of focus for SEO specialists have fast become outdated.

1. Keyword Exploitation (Stuffing)

[Image: Example of the keyword stuffing technique]

While keywords remain an important factor in your SEO performance, a large portion of digital marketers have lost the plot when it comes to understanding the role of keywords and how they should be used. Below we’ve listed some of the most common outdated keyword exploitation (stuffing) techniques:

Keyword Density

Many traditional SEO practices focused on writing content with a fixed ratio of a specific keyword in proportion to the overall page word count. While this was once a common practice that drove results, Google no longer uses keyword density as a ranking factor to determine whether a page ranks well.
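To make the idea concrete, here is a minimal sketch (in Python, with made-up sample copy) of how that fixed keyword-to-word-count ratio was typically calculated:

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Return the keyword's share of total words, as a percentage."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

copy = "Buy shoes online. Our shoes are the best shoes for every occasion."
print(round(keyword_density(copy, "shoes"), 1))  # → 25.0
```

A density that high would once have been a target; today it simply reads as spam to both humans and algorithms.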

Keyword Stuffing

One of the most common old-school SEO techniques was to pack webpages with the keywords a business wished to rank for. Keyword stuffing practices could include, but are not limited to, the following:
– putting every plausible keyword variation into the website footer
– scattering the keywords throughout the webpage
– hiding them from human visitors by matching the text colour to the site background colour, making them invisible to the naked eye but not to the bots (algorithm)

While these methods may once have seemed like brilliant strategies, search engine algorithms have caught on to the practice of keyword stuffing. They now treat it as a deliberate attempt to manipulate search results, and penalise accordingly.
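To illustrate how easy the hidden-text trick is to spot, here is a naive sketch in Python (the helper name and sample styles are our own invention, not how any search engine actually works) that flags inline CSS where the text colour matches the background colour:

```python
def looks_hidden(style: str) -> bool:
    """Flag inline CSS where the text colour equals the background colour."""
    # Parse "prop: value; prop: value" pairs into a dict.
    props = dict(
        (k.strip().lower(), v.strip().lower())
        for k, _, v in (p.partition(":") for p in style.split(";") if ":" in p)
    )
    color = props.get("color")
    bg = props.get("background-color") or props.get("background")
    return color is not None and color == bg

print(looks_hidden("color: #ffffff; background-color: #ffffff"))  # True
print(looks_hidden("color: #000000; background-color: #ffffff"))  # False
```

Real crawlers go much further (rendered styles, off-screen positioning, tiny fonts), but the point stands: text hidden from humans is trivial for a bot to detect.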

Associated Algorithm Updates 

The following updates by Google were used to catch out users engaging in keyword stuffing:

  1. Florida  —  November 1, 2003
  2. Austin  —  January 1, 2004
  3. Penguin  —  April 24, 2012
  4. Panda 4.1 (#27)  —  September 23, 2014
  5. May 2020 Core Update  —  May 4, 2020

2. Writing for Search Engines (Crawlers)

[Image: Search engine crawlers reading a website optimised only for them]

Back when SEO began, crawler algorithms were quite simple, and this resulted in unnatural content. In other words, content writers needed to write for the web, not the audience: repeating the same keyword every time an opportunity presented itself, and using variations (acronyms) and plural/non-plural versions of the word to ensure “all bases are covered.”

Today’s search engine algorithms have come a long way. They are smart enough to understand when keywords are used appropriately, when variations of keywords are used, and when content is low in informational quality.

Associated Algorithm Updates 

The following updates by Google were used to catch out users writing for robots:

  1. BERT (Worldwide)  —  December 9, 2019
  2. May 2020 Core Update  —  May 4, 2020

3. Faking Your Backlinks

[Image: Backlinks from purchased sources linking back to a website]

Most businesses struggle with backlinking and getting domains to link back to their website. So, to combat this, digital marketers found loopholes and tried cheating the system to get to the top by:

Purchasing of Backlinks

One method was to purchase links from backlink sellers. These could take the form of websites that exist purely to link out to other pages, low-cost overseas link farms, or websites linking out to every possible unrelated industry.

Now, this once-routine practice of buying links can get your website penalised. Businesses can no longer ignore the management of off-page SEO factors. Instead, webmasters need to manage low-quality and high-quality referring domains strategically.

Backlinking from Comments Section

Another commonly used technique was comment spamming. Individuals would visit other webpages, go to the comments section, and leave a link with no actual relevance to the topic of the article. While this trend hasn’t died out, website managers have become smarter: they manually filter comments, apply rel="nofollow" to comment links, or simply disable hyperlinks altogether.
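As a sketch of the kind of filtering website managers now apply, the snippet below (a simplified regex-based approach of our own, not a real CMS plugin) adds rel="nofollow" to comment links that don’t already carry a rel attribute, so they pass no link equity:

```python
import re

def nofollow_comment_links(html: str) -> str:
    """Add rel="nofollow" to anchor tags lacking a rel attribute."""
    # Negative lookahead: skip anchors that already declare rel=...
    return re.sub(r'<a (?![^>]*\brel=)', '<a rel="nofollow" ', html)

comment = '<p>Great post! <a href="https://example.com/spam">cheap pills</a></p>'
print(nofollow_comment_links(comment))
```

A production CMS would use a proper HTML parser, but the effect is the same: spam links in comments stop paying off.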

Using Link Exchange Software

Another commonly used technique was employing software to automate link exchanges. This software takes the “I’ll add you if you add me” (like-for-like) model and automates it: it inserts a hyperlink on one of your pages in exchange for a backlink.
The problem is that the websites you linked to wouldn’t always match up. Their relevance could vary wildly, and the link could be buried deep within an unrelated page.

Associated Algorithm Updates 

The following updates by Google were used to catch out users’ inappropriate linking practices:

  1. Dominic  —  May 1, 2003
  2. Jagger Update —  September 1, 2005
  3. Big Daddy Update —  December 15, 2005
  4. Penguin  —  April 24, 2012
  5. May 39-Pack  —  June 7, 2012
  6. Penguin Update 2.0 —  May 22, 2013
  7. Penguin Update 3.0 —  October 17, 2014
  8. Penguin Update 4.0 & Core Algorithm Integration —  September 23, 2016
  9. February 7 Update —  February 7, 2017

4. Over-Optimised Anchor Text

[Image: Example of over-optimised anchor text linking back to a website]

I am sure you are aware of how valuable internal linking is for structure, user experience, and ranking. Back in the day, anchor text and internal linking were equally important. The major difference was the factor used to assess whether an internal link was a good one.

In the early days, anchor text in exact-match form was given priority. But with algorithm updates, other anchor text forms (branded, naked, brand name, page title/headline, etc.) started to gain importance depending on the situation. Now the priority lies in anchor text that sounds organic (natural) and is readable (easy to understand).
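The anchor text forms named above can be illustrated with a toy classifier (the function, brand, and keyword here are hypothetical examples, not how any search engine actually categorises anchors):

```python
def classify_anchor(anchor: str, brand: str, keyword: str) -> str:
    """Toy classification of anchor text into the common forms."""
    a = anchor.strip().lower()
    if a.startswith(("http://", "https://", "www.")):
        return "naked"          # the raw URL itself
    if a == keyword.lower():
        return "exact-match"    # the target keyword verbatim
    if brand.lower() in a:
        return "branded"        # contains the brand name
    if keyword.lower() in a:
        return "partial-match"  # keyword plus surrounding words
    return "generic"            # e.g. "click here"

print(classify_anchor("https://example.com", "Acme", "running shoes"))  # naked
print(classify_anchor("running shoes", "Acme", "running shoes"))        # exact-match
print(classify_anchor("click here", "Acme", "running shoes"))           # generic
```

A healthy link profile today mixes these forms naturally, rather than leaning on exact-match anchors alone.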

Associated Algorithm Updates 

The following updates by Google were used to catch out users’ inappropriate anchor text practices:

  1. Google Toolbar  —  December 1, 2000
  2. Brandy  —  February 1, 2004
  3. Jagger Update  —  September 1, 2005
  4. Panda 3.4 (March 50-Pack)  —  April 3, 2012

5. Optimising for Text Only

[Image: SEO companies giving weightage only to text optimisation]

We all know that text (copy, content, etc.) is an essential component of SEO, and most businesses tend to put their efforts solely into text optimisation. While this doesn’t necessarily harm your SEO strategy, it does limit your SEO capabilities.


Google has been working on pushing its Image, Video & Voice Search features. The way people search using these mediums is different, and additional SERP factors come into play. Restricting your SEO optimisation efforts to text only therefore limits the scope of your ranking.

Associated Algorithm Updates 

The following updates by Google were used to give other media higher weightage:

  1. Universal Search —  May 1, 2007
  2. December 10-Pack —  December 1, 2011
  3. January 30-Pack —  January 5, 2012
  4. February 40-Pack (2) —  February 27, 2012
  5. Panda 3.4 (March 50-Pack)  —  April 3, 2012
  6. Video Carousels  —  June 14, 2018
  7. May 2020 Core Update  —  May 4, 2020

6. Trying to Dupe Crawlers

[Image: A search engine crawler jumping through multiple pages to view website content]

When SEO began, web crawlers were a lot simpler. As a result, webmasters had greater control over what crawlers should do and what weightage each page should be given (a priority value from 0.0 to 1.0).

However, Google soon caught on that this weightage wasn’t being used as intended, and dropped its honouring of the frequency and priority rating system in favour of a quality and substance rating system.
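The 0.0 to 1.0 weightage corresponds to the priority value in an XML sitemap. A minimal sketch of generating one (the URLs and priority values below are made up for illustration):

```python
import xml.etree.ElementTree as ET

# Hypothetical pages with the per-page priority hints webmasters once leaned on.
pages = [
    ("https://example.com/", 1.0),
    ("https://example.com/blog/", 0.6),
    ("https://example.com/privacy", 0.1),
]

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
urlset = ET.Element("urlset", xmlns=NS)
for loc, priority in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "priority").text = f"{priority:.1f}"

print(ET.tostring(urlset, encoding="unicode"))
```

Sitemaps remain useful for discovery, but setting every page's priority to 1.0 no longer buys any ranking advantage.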

Always remember you should never try to manipulate search engine crawlers. Rankings are now based on the quality and relevance of your content. Trying to dupe crawlers for better ranking results will have the opposite effect. 

Associated Algorithm Updates 

The following update by Google was used to catch out attempts to dupe crawlers:

  1. Universal Search  —  May 1, 2007


Staying on top of SEO updates is more critical than ever. Once a change has been implemented, the effect is immediate. So, if it has been a while since your last SEO update, your site and strategy may be negatively impacted.

Get in touch with Pounce today, and let us help you get back on track. 

