What then could be the implications for affiliates? We asked a leading content publisher on the network for their thoughts, including some practical ways they tackled the unforeseen arrival of ‘Fred’.
How did ‘Fred’ affect the performance of your website?
Our website has been running for more than five years and attracts the majority of its traffic from Google. During this time, we have been growing our traffic by publishing and curating content, including both editorial and user-generated content.
On the 8th March 2017, we suddenly lost 70% of our traffic from Google overnight. Our analytics and Google Search Console data indicated this was a site-wide drop affecting multiple pages and keywords. Our rankings dropped by a page or more on most of the search terms we track and disappeared altogether on some of the most competitive keywords.
Unfortunately, Google never officially confirmed the ‘Fred’ update and there was a lot of conflicting information in the aftermath of the roll-out. Originally, many people suggested the update was related to backlinks, which led to an initial scramble to review all of our links for quality. However, it later became clear that the update affected websites with a focus on ‘aggressive monetisation over the user experience’.
What action did you take to recover site traffic and revenue?
Having reviewed many other websites also affected by ‘Fred’, we devised a two-pronged strategy to recover our rankings:
Full technical crawl of website
The nature of the update indicated to us that Google may have developed an algorithm to detect affiliate links, including affiliate links cloaked behind internal redirects blocked in robots.txt. Google webmaster trends analyst Gary Illyes tweeted: ‘DYK there's no inherent problem with affiliate links? The problem is when a site's sole purpose is to be a shallow container for aff links’. This suggested to us that the update was related to issues of monetisation and content quality.
To understand how our website might be perceived in the eyes of Google’s algorithm, we undertook a full crawl of our website, specifically focusing on the number of pages, the amount of content on each page and outgoing links from that content.
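A crawl like the one described above can be approximated with a small script. The sketch below is not the publisher's actual tooling; it simply shows one way to count total and affiliate outgoing links on a single page, assuming (hypothetically) that affiliate links live under an internal redirect path such as `/go/`.

```python
from html.parser import HTMLParser

# Hypothetical prefix for internal affiliate redirects -- adjust to your own setup.
AFFILIATE_PREFIX = "/go/"


class LinkCounter(HTMLParser):
    """Counts total and affiliate outgoing links in one HTML page."""

    def __init__(self):
        super().__init__()
        self.total_links = 0
        self.affiliate_links = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href", "")
        self.total_links += 1
        if href.startswith(AFFILIATE_PREFIX):
            self.affiliate_links += 1


def count_links(html: str) -> tuple:
    """Return (total_links, affiliate_links) for a page's HTML."""
    parser = LinkCounter()
    parser.feed(html)
    return parser.total_links, parser.affiliate_links
```

Running this across every URL found in a crawl gives a per-page picture of how heavily monetised each page might look to a search engine.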
Redesign of price comparison pages/widgets
One of the most striking findings from the crawl of our website was that our price comparison widgets were creating a huge number of affiliate links. Due to the way the underlying HTML code was structured, up to seven affiliate links were created for every featured deal. With certain pages featuring more than one hundred deals, those pages were creating more than a thousand affiliate links.
To reduce the number of affiliate links, we decided to redesign our price comparison tables. This involved limiting the number of hyperlinks to just one per deal and also streamlining the number of deals shown on each page. This one change reduced the number of affiliate links across our entire website to just one-third of the original amount.
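To illustrate the kind of restructuring described above (the markup here is a simplified, hypothetical example, not the publisher's actual code): when the image, title, price and button of a deal card are each wrapped in their own anchor, one deal generates several affiliate links; moving the single link onto the call-to-action removes the duplication.

```html
<!-- Before: each element wrapped in its own affiliate link (illustrative) -->
<div class="deal">
  <a href="/go/deal-123"><img src="product.jpg" alt="Product"></a>
  <a href="/go/deal-123"><h3>Product name</h3></a>
  <a href="/go/deal-123"><span class="price">£99</span></a>
  <a href="/go/deal-123" class="btn">Get deal</a>
</div>

<!-- After: one affiliate link per deal, on the call-to-action only -->
<div class="deal">
  <img src="product.jpg" alt="Product">
  <h3>Product name</h3>
  <span class="price">£99</span>
  <a href="/go/deal-123" class="btn">Get deal</a>
</div>
```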
Implementing rel="nofollow" on all affiliate links
We already cloak our affiliate links via robots.txt both for ease of management and to ensure that no PageRank is passed through the link (this is typically recommended as best practice for affiliate sites). With the onset of ‘Fred’, we also added rel="nofollow" to all of our outgoing affiliate links. This helped us to make sure we remained fully compliant with Google’s webmaster guidelines.
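In practice, that combination looks something like the fragments below (the `/go/` redirect path is a hypothetical example; substitute whatever directory your affiliate redirects live under):

```
# robots.txt -- discourage crawling of the internal affiliate redirect directory
User-agent: *
Disallow: /go/
```

```html
<!-- Each outgoing affiliate link also carries rel="nofollow" -->
<a href="/go/deal-123" rel="nofollow">Get deal</a>
```

The robots.txt rule keeps crawlers out of the redirect directory, while rel="nofollow" signals that no PageRank should flow through the link itself.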
Review of content and user experience
Full page-by-page content audit
Next, we began a full content audit of every page uncovered in the technical crawl of our website. In this manual review, we were asking ourselves how useful that content might be if someone landed on it today. This was possibly the most time-consuming part of the journey as it required us to go back and review every piece of content we had ever published.
From the audit, we discovered that 43% of our pages were no longer relevant to users today. For instance, they might include old blog posts discussing an expired deal or reviews of a product no longer available to buy. Although most of the deals were clearly marked as ‘old’ or ‘expired’, it would have been a frustrating and confusing experience for users landing on those pages today. Also, our analytics software indicated that these pages received very little traffic (less than 1% since the start of the year). We therefore decided to remove all of those pages, setting up 301 redirects from each to a newer, more relevant page.
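A 301 (permanent) redirect of this kind can be set up in the web server configuration. As an illustration only (the paths and server are assumptions, not the publisher's setup), on an Apache server an .htaccess rule might look like:

```apache
# .htaccess -- permanently redirect a removed page to its nearest
# relevant replacement (paths are illustrative)
Redirect 301 /deals/expired-tv-offer /deals/televisions
```

A permanent redirect tells search engines that the old URL has moved for good, so any remaining link equity is consolidated onto the replacement page.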
This refinement leaves only evergreen, useful content for our readers.
Finally, we discovered lots of old user-generated content, including comments on blog posts. Some of these dated back several years and were of questionable relevance to users today. They were also causing our pages to load more slowly due to the increasing amount of data that needed to be downloaded.
We therefore changed how forum and blog comments are loaded, to balance relevance with page speed. Readers now see the most useful and recent comments immediately; older comments are loaded only when a user clicks a button to reveal them.
What impact have these changes made to your site performance?
One month after we implemented the above changes, we witnessed a 75% recovery in our website rankings. Since then, our website traffic has continued to increase, and two months on we are receiving 90% of our original traffic.
Although we are still without official confirmation of the ‘Fred’ update from Google, we recommend undertaking a full crawl and audit of your website (including price comparison widgets and other monetised pages). This may help to identify pages that are ‘over-monetised’ in the eyes of Google and may highlight opportunities to streamline the consumer journey elsewhere.
For further information regarding the Google ‘Fred’ update and how the recent changes may have affected your site performance, please take a look at the Awin blog.