This week’s #searchdrop looks at Google’s mobile indexing becoming mobile-only indexing, the temporary removal of the ‘request indexing’ tool and Google’s announcements from Search On 2020.
Yesterday, Google’s Senior Webmaster Trends Analyst, John Mueller, opened Pubcon Pro Virtual as a keynote speaker, discussing changes within SEO and the Google algorithm that we can expect to see over the coming year. As well as offering advice to small businesses, discussing how you should test your website and the importance of speed metrics, John also touched on the changes coming to mobile-first indexing.
Four years ago, Google announced that from then on the mobile version of each page would be the focus when it comes to indexing. They suggested at the time that the mobile and desktop versions of a page should mirror each other in aspects including content, schema, links, reviews and comments.
To begin with, SEOs believed that Google would continue to crawl the desktop version of pages alongside the mobile version, but in fact Google no longer crawls desktop pages unless it is checking for spam and manipulation.
Bearing this in mind, not a lot is really going to change. The search engine giant was due to move completely over to mobile-only indexing in September 2020, but this deadline has now been pushed back to March 2021. After this date, websites with both desktop and mobile pages will only have their mobile version indexed. Websites that are made solely for desktop will continue to be crawled by the mobile crawler.
The key to making sure your website isn’t negatively affected by mobile-only indexing is to focus on mobile parity. By making your mobile website the mirror image of your desktop site, both will be equally optimised and more likely to rank well. This covers images, blog posts, content and schema, but also smaller elements that are easy to overlook.
If you’re not sure where to start when it comes to ensuring parity, you can perform a mobile/desktop parity audit. Moz has written a great article that guides you through the process of a mobile parity audit.
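To give a flavour of what one step of a parity audit can look like, here is a minimal sketch of our own (an illustration, not Moz’s method): parse the HTML served to desktop and mobile visitors and compare titles, headings and a rough word count. In practice you would fetch each version with a desktop and a mobile User-Agent header first; the function names here are our own.

```python
# Sketch: compare the desktop and mobile versions of a page for content parity.
# Uses only the standard library; all names are illustrative.
from html.parser import HTMLParser

# Void elements never get a closing tag, so don't track them on the stack.
VOID_TAGS = {"br", "img", "meta", "link", "input", "hr", "source"}

class PageSummary(HTMLParser):
    """Collect the <title>, heading texts, and a rough visible word count."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.headings = []
        self.word_count = 0
        self._stack = []

    def handle_starttag(self, tag, attrs):
        if tag not in VOID_TAGS:
            self._stack.append(tag)

    def handle_endtag(self, tag):
        if self._stack and self._stack[-1] == tag:
            self._stack.pop()

    def handle_data(self, data):
        current = self._stack[-1] if self._stack else ""
        text = data.strip()
        if not text:
            return
        if current == "title":
            self.title = text
        elif current in ("h1", "h2", "h3"):
            self.headings.append(text)
        if current not in ("script", "style"):
            self.word_count += len(text.split())

def parity_report(desktop_html, mobile_html):
    """Summarise where the mobile page falls short of the desktop page."""
    d, m = PageSummary(), PageSummary()
    d.feed(desktop_html)
    m.feed(mobile_html)
    return {
        "title_match": d.title == m.title,
        "missing_headings": [h for h in d.headings if h not in m.headings],
        "word_count_ratio": m.word_count / d.word_count if d.word_count else 0,
    }
```

A page whose `word_count_ratio` falls well below 1, or which has entries in `missing_headings`, is a candidate for the kind of mobile parity fixes discussed above.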
You can find out more about John Mueller’s speech on Pubcon Pro Virtual by reading this article on Search Engine Roundtable.
Last week in our Search Drop we reported on the bug that caused mobile-indexing and canonicalisation issues in Google. Many website owners reported that some of their pages had been de-indexed, or had never been indexed properly by Google, without knowing why.
In recent weeks, the Google Webmasters Twitter account had been providing daily updates on how many pages and results had been restored and how many still needed to be fixed.
On the 14th of October, the Google Webmasters Twitter account announced that the manual ‘request indexing’ feature had been temporarily disabled in order to make some infrastructure changes. Google Webmasters also shared a link to this developer document, which explains in more detail not only how Google crawls websites but also what the page indexation process looks like. The announcement prompted a large number of questions and concerns from the public, as the infrastructure changes were confirmed just after Google’s team had fixed the mobile-indexing and canonicalisation issues.
At this point John Mueller, Webmaster Trends Analyst at Google, clarified that normal crawling and indexing would not be affected by the infrastructure improvements, and that in most cases sites don’t need to use the ‘request indexing’ feature in Google Search Console.
In fact, we couldn’t agree more with John. In normal situations, when Google’s indexing processes are working without interruption, the only action that needs to be taken is to provide Google with an XML sitemap that includes any recent modifications, notifying them of content changes ready to be re-indexed.
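As an illustration of what that looks like, a minimal sitemap entry signals a content change through its `<lastmod>` date (the URL and date below are placeholders, not from any real site):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Placeholder URL; <lastmod> tells crawlers when the page last changed -->
  <url>
    <loc>https://www.example.com/blog/updated-post/</loc>
    <lastmod>2020-10-14</lastmod>
  </url>
</urlset>
```

Submitting the sitemap once through Search Console (or referencing it in robots.txt) is enough; Google re-reads it periodically and picks up the updated dates.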
Barry Schwartz from Search Engine Land noted in his blog, however, that the main reason we as SEO specialists care about this feature is that in recent weeks, when many websites were unable to get their new content indexed properly, we used the ‘request indexing’ feature to speed up indexation. Barry also speculated that this may have created a burden for Google, as the volume of manual requests could be overwhelming and difficult to stay on top of. He added that there may be other unfixed indexing bugs we are not yet aware of, which might be why the feature was temporarily disabled.
In short, while the feature is disabled, if your page or whole website is not getting indexed properly, you are unable to push content through the Search Console ‘request indexing’ feature.
This week saw Google’s virtual Search On 2020 event, at which Google outlined some of the work it has done throughout the year. One of the key messages was that BERT now handles almost all English-language queries. Although this doesn’t directly impact the rankings of specific sites, it does improve the relevancy of search results by better understanding what each query is really asking for.
There were a host of other highlights from the event that are important for search in a couple of different ways:
Why is this so important? All of these new and advanced aspects of search can be leveraged to improve your site’s performance. Some of them you’ll need to work towards through optimisation and investment in new content; others will benefit the sites that are already looking after the user in the first place.
For more information on this, check out the Search On 2020 article.