Algorithm Rolled Out: March 8, 2017

Algorithm Overview

The Google Fred algorithm update was released to eliminate what Google considered low-quality results, such as sites with thin content and aggressive ad placement.

Many of the affected sites, but not all, were affiliate sites. The bulk of them relied on content as their primary source of traffic, which is normally exactly what we hear Google advising people to do.

You are familiar with these kinds of sites, and you most likely avoid them. The content quality on the impacted sites was generally poor, with a high density of advertisements.

While Gary Illyes of Google offered us a name for the update, he did not provide a list of the issues it targets beyond the following:

  • Far too many advertisements.
  • Thin, sparse content.
  • Low-quality links.
  • Low-quality content.
  • Aggressive affiliate linking.
  • Overwhelming interstitials.
  • Deceptive ads (ads that appear to be content).
  • An unbalanced ratio of main content to supplementary content.

All You Need to Know About the Fred Update

Sites with poor-quality, thin content

The majority of sites that lost search engine rankings and organic traffic were those built on “SEO content” – content created solely to improve rankings rather than around the user’s search intent. This is a strong indicator that Google is sticking to its objective of providing high-quality material tailored to its users’ needs.

Revenue-Generating Sites

Sites that generate revenue or leads but carry heavy advertising and too little content developed around the target audience’s search intent were also affected.

Sites carrying low-quality backlinks

Sites with low-quality backlinks, i.e., sites whose inbound links come mainly from low-domain-authority domains, were also hit.

Fred’s Timing

Fred has a unique sense of timing.

Fred was preceded by a large Google core update a month earlier, which was reported to focus on E-A-T.

Google unveiled Project Owl a week after Fred, intending to remove inaccurate and objectionable content based on feedback from its quality raters. The raters were only in charge of teaching the algorithm to spot false or objectionable content, not of deciding which sites should be removed from the results. Google’s use of quality-rater data shows how clearly it was focused on quality, and Fred was no different.

Algorithm Solution: Steps You Can Take to Cope With Google’s Algorithm Guidelines

1. Monitor Your Website Traffic

Keep an eye on your website analytics to see how these changes are influencing your site’s rankings and traffic. If you notice unusual changes in rankings or traffic, compare the dates of those changes against the dates of known algorithm updates, as sketched below.
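As a rough illustration, the following sketch compares average daily sessions in the two weeks before and after a known update date. The file name and column names (“date”, “sessions”) are assumptions about what your analytics export looks like, so adjust them to match your own data.

```python
# Minimal sketch: compare average daily sessions before and after a known
# algorithm-update date. Assumes a CSV exported from your analytics tool
# with "date" (YYYY-MM-DD) and "sessions" columns (placeholder names).
import pandas as pd

UPDATE_DATE = "2017-03-08"  # Fred's reported rollout date
WINDOW_DAYS = 14            # days to compare on each side of the update

df = pd.read_csv("daily_sessions.csv", parse_dates=["date"])
update = pd.Timestamp(UPDATE_DATE)

before = df[(df["date"] >= update - pd.Timedelta(days=WINDOW_DAYS)) & (df["date"] < update)]
after = df[(df["date"] >= update) & (df["date"] < update + pd.Timedelta(days=WINDOW_DAYS))]

change = (after["sessions"].mean() - before["sessions"].mean()) / before["sessions"].mean()
print(f"Average daily sessions before: {before['sessions'].mean():.0f}")
print(f"Average daily sessions after:  {after['sessions'].mean():.0f}")
print(f"Change: {change:+.1%}")
```

If the change lines up with an update date, dig into which pages and queries moved before changing anything on the site.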

2. Generate Quality Content

High-quality content is not all about word count or keyword density. Make sure your content is relevant and delivers unique value to the reader; even a short piece can fulfil your audience’s needs and appeal to them.

Use a website auditing tool to monitor important indicators such as bounce rate and time on page, and to see which content is ranking and generating traffic to your website. This will help you figure out what is working and what is not, so you can make the required content changes; one simple way to flag under-performing pages is sketched below.
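As one rough way to surface problem pages, the sketch below flags pages with a high bounce rate and a low average time on page from a per-page export. The file name, column names, and thresholds are illustrative assumptions, not fixed values.

```python
# Minimal sketch: flag pages whose engagement metrics suggest thin or
# low-value content. Assumes a per-page CSV export with "page",
# "bounce_rate" (0-1) and "avg_time_on_page" (seconds) columns
# (placeholder names; map them to whatever your tool exports).
import pandas as pd

BOUNCE_THRESHOLD = 0.80   # flag pages where 80%+ of visits bounce
TIME_THRESHOLD = 30       # ...and visitors stay under 30 seconds

pages = pd.read_csv("page_metrics.csv")
flagged = pages[
    (pages["bounce_rate"] >= BOUNCE_THRESHOLD)
    & (pages["avg_time_on_page"] < TIME_THRESHOLD)
]

# Review these pages first: rewrite, expand, consolidate, or remove them.
print(flagged.sort_values("bounce_rate", ascending=False).to_string(index=False))
```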

3. Superior Backlinks

If you discover that a low-quality or unrelated site has linked to you, contact them and request that they remove the link, or use Google’s Disavow Tool to have the links discounted. Doing so helps you avoid a Google manual penalty. If you have any affiliate links, make sure they all carry the rel="nofollow" attribute.

Keep track of backlinks from other websites with link-audit software. The following backlink analysis tools are a good starting point:

Ahrefs, Moz Open Site Explorer, and Search Engine Journal. Beyond these, there are several other tools available to help you assess the quality of your backlinks. Once you have an export of your backlink profile, a short script can shortlist domains worth reviewing, as sketched below.
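The sketch below turns a backlink export into a candidate list in the format Google’s Disavow Tool accepts (one domain: entry per line, with # for comments). The file name, column names, and the rating cutoff are assumptions about a typical backlink-tool export, and any generated list should be reviewed manually before uploading.

```python
# Minimal sketch: shortlist low-rated referring domains and write them in
# Google's disavow file format. Column names and the cutoff are assumptions;
# adjust them to match your backlink tool's export.
import pandas as pd

RATING_CUTOFF = 20  # illustrative threshold, not a recommendation

links = pd.read_csv("backlinks_export.csv")
candidates = sorted(
    links.loc[links["domain_rating"] < RATING_CUTOFF, "referring_domain"].unique()
)

with open("disavow.txt", "w") as f:
    f.write("# Candidate low-quality domains - review manually before uploading\n")
    for domain in candidates:
        f.write(f"domain:{domain}\n")

print(f"Wrote {len(candidates)} candidate domains to disavow.txt")
```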

4. Primary Focus on SEO and Quality Content

There is no denying that Google releases updates frequently, and you will not get a notification for each one. The easiest way to avoid being penalised by these adjustments is to make sure you are always offering excellent content to your users: content that is not centred on search engine algorithms or generated only to make money. Google has published webmaster guidelines that outline specific actions you can take to help Google better understand your website and how it is built, as well as practices to avoid. Reading Google’s Quality Rater Guidelines can also give website owners a decent sense of the sort of material Google is looking for.

Recovering From the Fred Update:

Surprisingly, not all webmasters noticed a significant drop in traffic once the Fred update went live. As Glenn Gabe of G-Squared Interactive highlighted in his write-up, certain sites witnessed significant spikes in traffic after the deployment, with some seeing rises of more than 100 percent.

The Google Analytics data from one of Gabe’s clients, shown as a screenshot in his write-up, illustrates this: rather than seeing a drop in traffic, this website’s owner reported a 125 percent rise in visits overnight.

What is more fascinating is that this significant increase in traffic was not the product of SEO work completed on or around March 7, 2017; the work had been completed well before Fred launched. As Barry Schwartz remarked in his analysis, this suggests that Fred was not a major overnight change, but rather a fine-tuning of algorithmic adjustments that were already in place.
