
This week’s #searchdrop covers the initial findings of the May the 4th algorithm update, which seems to have impacted thin content pages, local SERPs, aggregators and directories, as well as SERP features. We also look into what John Mueller has recently said about the BERT update and negative SEO.

Findings of the May the 4th Google algo update

Last week in #searchdrop we discussed the announcement of Google’s recent algorithm update and what to monitor for any issues it might cause. You can see last week’s release here.

This week, we’re covering the initial findings as the update’s rollout continues. Thanks to Roger Montti’s consolidation of findings from a range of agencies worldwide, we’re able to get a clearer picture of what is happening globally.

Effects of Google’s May the 4th update:

  • Thin content propped up by external factors losing – where a site has a page with poor-quality content, or content too thin to be meaningful and useful to users, that page has been ranking on the value of its backlinks alone. These pages have seen a noticeable drop during this update, so if this is happening to your site, it’s worth improving your content.
  • Local SERPs in major fluctuation – this is an interesting impact when you consider the situation a lot of local businesses are currently going through. As the wording suggests, local search results are fluctuating. This is likely related to the point above: local businesses often have thin content pages propped up by age-old backlinks from SEO directories. The recommendation here is to improve your site content and ensure your citations are aligned across recommended directories – which is a nice little segue to the next finding.
  • Aggregators and directories winning – sites like these are always set to do well with Google’s algorithm as it stands, as they give the user much more information and choice. So if you’re in an industry with this sort of site, it’s worth considering how to improve your performance on them. Do you have a relationship with the site? Do you have good reviews? Ranking within a site that already ranks for a competitive term is a technique called Barnacle SEO.
  • SERP features may be in fluctuation – this update seems to be focused on technology, health and food sites. If your site relies heavily on SERP features such as answer boxes, it’s recommended that you review your rankings for relevant terms and perform a SERP feature gap analysis to see what the newly ranking featured snippet has that you don’t.

The effects of the algorithm will continue to be felt for a few weeks before this updated algorithm becomes the new normal, so keep your eyes on your analytics and your Search Console bookmark at the ready.

For the original story, see here.

John Mueller on BERT and negative SEO

John Mueller, Webmaster Trends Analyst at Google, has long been a connection between the SEO community and Google, offering advice and support where possible – and without giving out too much information on the algorithm!

Recently, John discussed two topics which we thought were of interest and worthy of mentioning in this week’s searchdrop – the BERT ‘update’ and negative SEO.

The BERT ‘update’

John recently discussed the BERT ‘update’ and, although he didn’t cover the suspected link to searcher intent, described it as “a web of better understanding text. It’s not a ranking change in that sense.” This means that BERT works to understand the ‘context’ of your copy, which relates to the intent of the search and the terms used.

John also said the following about the types of terms impacted by the BERT update: “Particularly for longer, more conversational queries, or searches where prepositions like ‘for’ and ‘to’ matter a lot to the meaning, search will be able to understand the context of the words in your query.”

So if you think you were hit by BERT, it’s worth considering how you’re serving users that are searching with these more conversational search queries. Does your content suitably cover the topic enough to be a ‘catch all’ for all intents and phrasing for your chosen topic?

Negative SEO

This is a term used to describe the practice of attempting to get a competitor’s site penalised by Google by pointing poor-quality links (often adult in nature) at it. It has, however, long been a disputed tactic… do people actually do it?

John Mueller doesn’t believe so (in most cases): “Usually the cases where I see that something around negative SEO is happening are kind of the cases where you would look at them manually. You would say, well this looks like maybe someone has built these links up over the past.”

However, he does mention that Google’s algorithm is getting much better at understanding which links to ignore and when links have been built unnaturally.

What can I do to protect against negative SEO?

It’s difficult to protect against negative SEO. You can, however, ensure the following is in place:

  • Ensure you have all the latest security features – be it security headers on your server or security plugins like Wordfence to protect against hacks and to identify and blacklist spam traffic.
  • Regularly monitor your backlink profile – this can be done in Search Console or with a whole range of SEO tools (Moz, SEMrush etc). Look for irrelevant backlinks with high spam scores or very low domain authority. Some of these links will be gained naturally, but an attempt at negative SEO will typically show adult themes and an unnaturally large number of links appearing quickly.
  • Take a natural approach to link building yourself – although this won’t directly protect you, it will make anything out of place abundantly clear when you review your profile, and potentially in Google’s eyes too.
  • Use the disavow tool if needed – if you think you’ve already been impacted, consider using Google’s disavow tool to proactively tell Google to ignore specific links or domains currently linking to your site.
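To show how the monitoring and disavow steps above fit together, here is a minimal sketch in Python. The backlink data, spam-score threshold and domain names are all hypothetical examples – in practice you would export your links from Search Console or an SEO tool – but the output follows the `domain:` syntax Google’s disavow tool accepts.

```python
# Sketch: flag suspicious backlinks, then build a Google disavow file.
# The backlink list and thresholds below are hypothetical examples;
# real data would come from a Search Console or SEO-tool export.

from urllib.parse import urlparse

# Hypothetical export: (linking URL, spam score 0-100, domain authority 0-100)
backlinks = [
    ("https://example-blog.com/post", 5, 45),
    ("https://spammy-links.example/page1", 92, 2),
    ("https://spammy-links.example/page2", 95, 2),
    ("https://local-directory.example/listing", 20, 30),
]

SPAM_SCORE_THRESHOLD = 80  # assumed cut-off; tune against your own profile
MIN_AUTHORITY = 5          # assumed floor below which a link looks throwaway

def suspicious_domains(links):
    """Return sorted domains whose links look like a negative SEO attempt."""
    flagged = set()
    for url, spam_score, authority in links:
        if spam_score >= SPAM_SCORE_THRESHOLD and authority <= MIN_AUTHORITY:
            flagged.add(urlparse(url).netloc)
    return sorted(flagged)

def build_disavow(domains):
    """Format domains using the disavow file syntax: one rule per line,
    '#' lines as comments, 'domain:' to disavow a whole domain."""
    lines = ["# Domains flagged as likely negative SEO"]
    lines.extend(f"domain:{d}" for d in domains)
    return "\n".join(lines)

if __name__ == "__main__":
    print(build_disavow(suspicious_domains(backlinks)))
```

The resulting text file can be uploaded through Google’s disavow links tool; disavowing at the `domain:` level rather than per URL is usually the safer choice when a domain is sending many spammy links at once.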