News and Insights

Roundup of Google updates from October 2020

November 18, 2020

We were nearly halfway through autumn when the pandemic bared its teeth once again. The virus continued its rampant spread, and governments across the western hemisphere began reinstating local restrictions.

All omens pointed to an imminent second lockdown, and people’s hopes of reopening the economy were dashed.

For the most part, economies showed surprising resilience, and businesses a creative flair in adapting to the new reality. As a result, migrating online and shipping directly to customers have become main business objectives across the board.

The online retail boom translated into new updates from Google at a hectic pace. The search engine is deploying more sophisticated artificial intelligence to better understand not only what people are searching for but also the context around their queries.

What’s more, Google now highlights the best shopping deals in search results, maximising exposure for retailers and helping shoppers get the best prices. A key takeaway from this month’s updates is that more and more web page content will be eligible to rank in the SERPs, increasing competition and making SEO work even more challenging.

Last but not least, Google found itself in the midst of turmoil, with the US Department of Justice taking the most significant action against the tech company in 20 years on grounds of anti-competitive practices.

Without further ado, let’s dive into the October highlights in the world of Google.

Google news in Oct 2020 part 1

1st Oct – Google confirms two separate indexing outages

The month’s first day started with Google announcing two indexing issues in search results: one affecting mobile indexing and the other canonicalisation.

Danny Sullivan, via Google SearchLiaison on Twitter, explained that there were two ongoing indexing issues impacting some URLs: one affecting mobile indexing and the other canonicalisation, which concerns how duplicate content is detected and handled.

https://twitter.com/searchliaison/status/1311805474992320512?s=20

This tweet effectively confirmed multiple reports made over the course of that week, with various mentions of pages dropping out of search results.

One of the many examples was provided by a user after John Mueller asked for an example of a dropped URL.

John Mueller, tweet

So, what are the mobile and canonicalisation indexing issues Google had to deal with?

As for the mobile indexing issue, Google had trouble indexing and ranking primarily the mobile versions of web pages. The canonicalisation issue appeared to be a problem in how Google handled pages that were cited or duplicated by other pages, resulting in the original page not showing in the search results.
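Canonicalisation normally relies on signals publishers control, chief among them the rel="canonical" link element. As a quick, unofficial illustration (unrelated to Google’s internal fix), here is a minimal Python sketch, using the third-party requests and beautifulsoup4 libraries, that reports which URL a page declares as canonical; the example.com URL is a placeholder.

```python
# Minimal sketch: fetch a page and print the URL its rel="canonical" tag
# declares, i.e. the publisher-side signal Google's canonicalisation reads.
# Requires: pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup

def get_canonical(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    tag = soup.find("link", rel="canonical")
    return tag.get("href") if tag else None

page = "https://example.com/some-article"  # placeholder URL
canonical = get_canonical(page)
if canonical and canonical != page:
    print(f"{page} points to a different canonical: {canonical}")
else:
    print(f"Canonical for {page}: {canonical or 'not declared'}")
```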

After a whole week of fuss within the publishing and SEO communities, Google stepped in, acknowledged the problem and took action to resolve it.

8th Oct – Public Liaison for Search releases explainer on Google autocomplete predictions

Danny Sullivan, Google’s Public Liaison for Search, posted an explainer about how Google autocomplete predictions are actually generated.

All of us who use Google are familiar with autocomplete: the predictions that appear as soon as we start typing the first words, or even letters, into the Google search bar. The feature is a very effective way to speed up searching for information online.

Google has explained how autocomplete works in the past, but this time the focus was on how the predictions themselves are generated.

Danny Sullivan explains that whilst autocomplete is helpful, Google is ready to take action in situations where showing a prediction is not the best solution.

But where do predictions really come from? Sullivan explains that autocomplete predictions are real searches that have previously been performed on Google.

It turns out that Google’s predictions are based on four factors: trending queries, the language of the searcher, the location of the searcher and freshness. At the most basic level, predictions are driven mainly by trending queries; to make them more relevant, the searcher’s language and location are also considered.

Here’s an example of two different predictions for “driving test” based on trending queries, language and location of searchers in California (USA) and Ontario (Canada) respectively:

two different predictions for "driving test" based on trending queries.

Google can predict not just part of a query but the whole query as well. This happens when searches are less common and the trending-queries factor alone is not enough to provide the most relevant predictions.

Google may also prioritise freshness if there is rising interest in a certain topic.

Sullivan’s own example is illuminating enough to get the point across:

For example, searches for a basketball team are probably more common than individual games. However, if that team just won a big face-off against a rival, timely game-related predictions may be more useful for those seeking information that’s relevant in that moment.

Danny Sullivan, Public Liaison for Search

Sullivan was also careful to highlight that autocomplete is not designed solely to display the most common queries on a given topic, making clear that it differs from, and should not be compared to, Google Trends, where search popularity is the predominant factor.
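To make the four factors concrete, here is a deliberately toy Python ranker that scores candidate completions by trending popularity, freshness, and whether they match the searcher’s language and location. The weights and data are invented for illustration; Google’s actual system is far more sophisticated.

```python
# Toy illustration of combining the four prediction factors Sullivan lists.
# The weights and candidate data are invented; this is not Google's system.
from dataclasses import dataclass

@dataclass
class Candidate:
    text: str
    popularity: float  # trending-query strength, normalised 0..1
    freshness: float   # recent spike in interest, normalised 0..1
    language: str
    region: str

def score(c, user_lang, user_region):
    s = 0.6 * c.popularity + 0.2 * c.freshness
    if c.language == user_lang:
        s += 0.1  # predictions in the searcher's language are more relevant
    if c.region == user_region:
        s += 0.1  # so are locally trending queries
    return s

candidates = [
    Candidate("driving test california", 0.8, 0.2, "en", "US-CA"),
    Candidate("driving test ontario", 0.7, 0.3, "en", "CA-ON"),
]

# A searcher in California sees the California completion ranked first.
print(max(candidates, key=lambda c: score(c, "en", "US-CA")).text)
```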

When are predictions not displayed?

There are cases where predictions are withheld because they could be unexpected or shocking, or could lead searchers to unreliable content.

Google deals with this in two ways: automatically and manually.

Automatically: systems are designed to prevent potentially unhelpful and policy-violating predictions from appearing.

Manually: when Google’s automated systems miss predictions that violate its policies, enforcement teams are in place to remove those predictions in accordance with the policies.

Sexually explicit, hateful, disparaging or dangerous terms and phrases will not be provided in the form of autocomplete suggestions.

On top of that, Google prevents autocomplete suggestions that could return unreliable content, as well as queries that could be interpreted as confirming unconfirmed rumours.

Having said that, Danny Sullivan made clear that although Google may withhold autocomplete predictions in certain situations, users are free to search for anything they wish.

9th Oct – Google’s John Mueller on copying top-ranked sites

Once again, a Twitter user asked a question that sparked another insightful discussion involving John Mueller.

A person asked John Mueller on Twitter whether copying the way Amazon has set up its homepage menu would be beneficial for their site as well.

Response to John Mueller's tweet

John Mueller replied:

Unless you’re Amazon, I wouldn’t assume that you can just reuse the same thing and you’ll rank like Amazon. Large sites do good and bad things, they don’t apply to all other sites.

John Mueller, Senior Webmaster Trends Analyst at Google

The person then went on to say that their site has 20M+ URLs and that they would like to know whether such a strategy would help user experience and the bot at the same time.

You can make this kind of on-site navigation work for search, or not work for search, depending on how you implement it. I would not blindly implement it like Amazon, but rather work out what *you* need for *your particular site*, both for users and for SEO.

John Mueller, Senior Webmaster Trends Analyst at Google

9th Oct – US Government report accuses Google of anti-competitive practices

A 450-page report released by the House Antitrust Subcommittee accuses Google of anti-competitive business practices. The report also suggests remedies for containing Google and restoring competition, including in search results.

The Antitrust Subcommittee makes a clear allegation of monopoly:

Google has a monopoly in the markets for general online search and search advertising.

House Antitrust Subcommittee

The report outlines the strategies Google used to achieve and maintain its monopoly, such as browser and mobile device default settings that ensure Google is preferred over competitors, forcing mobile phone manufacturers to install Google’s apps by default, and leveraging acquisitions to achieve market dominance.

Google received severe allegations of anti-competitive tactics. Amongst other things, Google was accused of using third-party content to benefit its own “inferior vertical offerings”, penalising competing verticals for the benefit of its own, and stuffing search results with ads that are difficult to distinguish from organic results.

The House Subcommittee provided a fair number of suggestions for restoring competition and preventing market dominance by digital companies.

Some of which were:

  • Structural separations and prohibitions of certain dominant platforms from operating in adjacent lines of business;

  • Nondiscrimination requirements, prohibiting dominant platforms from engaging in self-preferencing, and requiring them to offer equal terms for equal products and services;

  • Interoperability and data portability, requiring dominant platforms to make their services compatible with various networks and to make content and information easily portable between them;

  • Presumptive prohibition against future mergers and acquisitions by the dominant platforms;

  • Safe harbor for news publishers in order to safeguard a free and diverse press; and

  • Prohibitions on abuses of superior bargaining power, proscribing dominant platforms from engaging in contracting practices that derive from their dominant market position, and requiring due process protections for individuals and businesses dependent on the dominant platforms.

As this report could bring about legal changes with regard to competition as well as merger and acquisition practices, it is quite likely that its findings will eventually impact Google’s SERPs.

A good reason to assume so is this recommendation from the report concerning Section 2:

The Subcommittee recommends that Congress consider whether making a design change that excludes competitors or otherwise undermines competition should be a violation of Section 2, regardless of whether the design change can be justified as an improvement for consumers.

House Antitrust Subcommittee

Self-preferencing and favouritism are among the practices Google has allegedly employed in designing its search results.

Given these developments, we might see Google change the way results are rendered so that they favour the technology giant’s own interests less.

14th Oct – A confidential Google Chat incident report leaks, showing how outages happen

A Google incident report labelled confidential was leaked, giving us a unique opportunity to see first-hand what can cause a system outage at Google and how things can go wrong inside its highly complex systems.

For context, the Google Chat outage described in this report preceded the various indexing issues covered above.

A follow-up communication (a Google Cloud issue summary) later explained the root cause in detail:

ROOT CAUSE

The Google Chat backends utilize a number of pre-processing functions prior to processing an incoming request. These pre-processors perform a number of calls to different services (such as Google’s internal Identity service) and store these results in a local cache.

One of these preprocessors had been encountering an access error due to an incorrectly configured backend request, which prevented it from successfully completing. This error initially did not cause any further issues.

Google Cloud Issue Summary

It turned out that the internal engineering team was unaware of the pre-existing error that, after an update on 17 September 2020, triggered the major outage.

In effect, the September update introduced a post-processor that expected output from the failing preprocessor; since that output was never produced, another error occurred, ultimately causing the Google Chat outage.

Consequently, Google had to release a new update to make up for the previous undetected error.

On September 17th, a new release of the Google Chat backend was deployed. This release included a change that required a post-processor to have access to the results of the failed preprocessor above.

However, as this preprocessor aborted its processing due to the access error, the cache was never populated. Initially, this post-processor attempted to retrieve the required value, but because the cache did not contain the value required, this spawned a new thread that attempted to retrieve the value, but had a dependency on the post-processor that was holding a lock.

This created a deadlock condition that was unable to be completed. This deadlock caused the backend binary tasks to experience high thread lock contention, which ultimately led to application errors.

Google Cloud Issue Summary
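The failure mode described above (a thread holding a lock while waiting on another thread that needs the same lock) is easy to reproduce. Below is a minimal Python sketch of that shape, not Google’s actual code; timeouts are added so the demo terminates instead of hanging forever.

```python
# Minimal reproduction of the deadlock shape in the report: a post-processor
# holds a lock while waiting for a helper thread that needs the same lock.
# Timeouts are added so this demo terminates rather than hanging.
import threading

cache_lock = threading.Lock()  # non-reentrant lock guarding the cache
cache = {}

def fetch_missing_value():
    # The helper needs the lock its parent is still holding.
    if cache_lock.acquire(timeout=1):
        cache["preprocessed"] = "value"
        cache_lock.release()
    else:
        print("helper gave up: lock held by its own parent -> deadlock")

def post_processor():
    with cache_lock:
        if "preprocessed" not in cache:  # preprocessor failed; cache is empty
            helper = threading.Thread(target=fetch_missing_value)
            helper.start()
            helper.join(timeout=2)  # waits on the helper while holding the lock

post_processor()
```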

In the same report Google concluded:

To prevent the recurrence of this issue and reduce the impact of similar events, the following actions are being taken:

● Adjusting the automated alerting system to improve the detection of lock contention issues.
● Increasing the number of threads available to Google Chat backend services in order to reduce the potential impact of lock contention events.
● Defining new testing which triggers this particular code path and identify this issue before reaching production.

Google Cloud Issue Summary

The lesson from this incident is that even a minor glitch in Google’s services can cause an outage, bringing a great deal of disruption for both Google and its users.

14th Oct – Google: “Desktop-only sites will be completely dropped from the index from March 2021”

Google’s John Mueller disclosed valuable insights about the infamous mobile-first index. The striking news is that desktop-only sites will not be indexed from March 2021 onwards; their desktop content will be ignored.

Mueller also added that there are bugs affecting m-dot sites (separate websites designed solely for mobile devices, typically served from an m. subdomain) and made clear that the start date for the mobile-only index is unlikely to change.

In a keynote at a marketing conference, John Mueller announced that March 2021 is when the full mobile index is scheduled to kick in. He went on to say that desktop-only sites, together with their images and any other content assets, will be dropped from the index.

We’re now almost completely indexing the web using a smartphone Googlebot, which matches a lot more what users would actually see when they search.

And one of the things that we noticed that people are still often confused about is with regards to, like if I only have something on desktop, surely Google will still see that and it will also take into account the mobile content.

But actually, it is the case that we will only index the mobile content in the future.

So when a site is shifted over to mobile first indexing, we will drop everything that’s only on the desktop site. We will essentially ignore that. And the final deadline we’ve come up with is March 2021.

John Mueller, Senior Webmaster Trends Analyst at Google

Quite interestingly, John said that in some cases, such as m-dot sites that use hreflang attributes, Google will not be able to send desktop users from the search results to the desktop version. In such cases, Google will send desktop users to the m-dot mobile version instead.

When it comes to mobile-first indexing, this makes things a lot trickier. So we can process these but what will happen is we will only index the m-dot version of the site and it can happen that we show the m-dot version of the site in the desktop search results.

Usually we try to show the appropriate version, desktop or mobile version, in the search results, the URL at least. The indexed content is… only the mobile version.

But with m-dot sites it can sometimes happen that we just have the m-dot version where we didn’t actually pick up that there’s a connection to a desktop version here. This is a lot more likely if you have a m-dot version and use an hreflang.

John Mueller, Senior Webmaster Trends Analyst at Google
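For sites on separate mobile URLs, Google’s documented setup is bidirectional annotations: the desktop page declares its m-dot counterpart via rel="alternate", the m-dot page points back with rel="canonical", and hreflang on desktop pages should reference desktop URLs while hreflang on mobile pages references mobile URLs. As a rough, unofficial illustration of those rules (with example.com as a placeholder domain), a Python sketch:

```python
# Sketch of the separate-URL (m-dot) annotations Google documents, with
# example.com as a placeholder. Note the hreflang rule Mueller alludes to:
# desktop pages reference desktop alternates, mobile pages reference mobile
# alternates; mixing the two is a common source of trouble.
DESKTOP = "https://www.example.com"
MOBILE = "https://m.example.com"

def desktop_head(path, hreflangs):
    tags = [f'<link rel="alternate" media="only screen and (max-width: 640px)" '
            f'href="{MOBILE}{path}">']
    tags += [f'<link rel="alternate" hreflang="{lang}" href="{DESKTOP}{p}">'
             for lang, p in hreflangs.items()]
    return tags

def mobile_head(path, hreflangs):
    tags = [f'<link rel="canonical" href="{DESKTOP}{path}">']
    tags += [f'<link rel="alternate" hreflang="{lang}" href="{MOBILE}{p}">'
             for lang, p in hreflangs.items()]
    return tags

hreflangs = {"en": "/page", "de": "/de/page"}
print("\n".join(desktop_head("/page", hreflangs) + mobile_head("/page", hreflangs)))
```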

Google news in Oct 2020 part 2

15th Oct – Google announces how AI empowers better search

Google shared information about important changes in how AI will impact site ranking, SEO and publishing. The individual updates are expected to impact around 7–10% of searches each, while BERT, which affected only about 10% of searches last year, will now be involved in virtually 100% of English queries.

From an SEO standpoint, our work will most probably get harder, as the new algorithm improvements will create more diversity in the search results. What we took from Google’s publication is that BERT is now applied to almost every search query.

Today we’re excited to share that BERT is now used in almost every query in English, helping you get higher quality results for your questions.

Jacob Devlin, Research Scientist

Were you wondering what BERT is?

BERT stands for Bidirectional Encoder Representations from Transformers. In plain language, it is a natural language processing (NLP) pre-training technique that helps the search engine understand the context around the words being searched for. According to Google, BERT helps the search engine identify the intent behind a search query.

Google says BERT will be involved in virtually every English-language search query moving forward.
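You can get a feel for what “understanding context” means by playing with a pretrained BERT model yourself. The sketch below uses the open-source Hugging Face transformers library and the public bert-base-uncased checkpoint, which is of course not the model Google Search runs; it simply shows how the surrounding words change what the model predicts for a blank.

```python
# Hands-on feel for BERT's bidirectional context using the open-source
# Hugging Face library (pip install transformers torch). This public
# checkpoint is NOT Google Search's production model; it just shows how
# context changes what fills the blank.
from transformers import pipeline

unmasker = pipeline("fill-mask", model="bert-base-uncased")

# The same blank gets different predictions depending on surrounding words.
for sentence in [
    "He deposited the cheque at the [MASK].",
    "They had a picnic on the [MASK] of the river.",
]:
    top = unmasker(sentence)[0]  # highest-probability completion
    print(f"{sentence} -> {top['token_str']} ({top['score']:.2f})")
```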

Google also announced spelling improvements to its algorithms, the biggest Google has seen in five years. The innovation this brings to the table is that the search engine will be better able to determine the context of misspelled words.

Indexed passages. The most impactful change Google introduced is passage indexing. Google will now index individual passages within a web page, not just the web page itself. Passages will be able to rank for a search query in their own right, and this will impact 7% of all search queries.

The new capability will allow Google to effectively spotlight pages whose content contains the answer to a query.

Here is a good example from Google ranking a passage:

Example of Google ranking a passage
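To make the idea concrete, here is a toy Python sketch of passage scoring: split a page into passages and surface the one that best matches the query. Google’s real system uses neural models; the plain term overlap below is only there to illustrate the concept, and the page text is invented.

```python
# Toy illustration of passage indexing: split a page into passages and
# return the one that best matches the query. Term overlap stands in for
# the neural scoring Google actually uses; the page text is invented.
def split_passages(text, size=15):
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

def overlap(query, passage):
    return len(set(query.lower().split()) & set(passage.lower().split()))

page = ("Our blog covers home window maintenance. You can check whether "
        "your house windows are UV protected with a simple test using a "
        "UV flashlight. We also review curtains and blinds for every budget.")

query = "how to check if windows are uv protected"
print(max(split_passages(page), key=lambda p: overlap(query, p)))
```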

Another way the new updates will dramatically impact search queries is the subtopics improvement.

People who make broad, generic queries like “home exercise equipment” can have scores of different things they are looking to find. Are they looking for budget equipment, premium options or small-space ideas?

We’ve applied neural nets to understand subtopics around an interest, which helps deliver a greater diversity of content when you search for something broad. As an example, if you search for “home exercise equipment,” we can now understand relevant subtopics, such as budget equipment, premium picks, or small space ideas, and show a wider range of content for you on the search results page. We’ll start rolling this out by the end of this year.

Jacob Devlin, Research Scientist

The new implementation will kick off by the end of 2020 and will bring greater diversity to search results for broad queries.

How will this impact the SERPs?

Subtopic ranking will make high-volume keywords more difficult to rank for, but it will benefit those who target specific subtopics and would normally never rank for the higher-volume terms.

So, if your business competes for a broad keyword or phrase, do consider optimising all of your subtopic pages.

Inevitably, video content will play a crucial role within these algorithm changes. According to Google, 10% of searches will be impacted by video-related updates. It seems that SEO experts will have to consider adding video content to their publications.

Here, the previously mentioned passages concept applies, the only difference being that it concerns videos. Google will analyse a video, tag each section according to the content it best describes, and return the relevant moment as an answer to the searcher’s query.

It goes without saying that this will be a great shift in the way video content is planned and produced, and in how the algorithm understands the key moments in videos.

Using a new AI-driven approach, we’re now able to understand the deep semantics of a video and automatically identify key moments. This lets us tag those moments in the video, so you can navigate them like chapters in a book.

Whether you’re looking for that one step in a recipe tutorial, or the game-winning home run in a highlights reel, you can easily find those moments. We’ve started testing this technology this year, and by the end of 2020 we expect that 10 percent of searches on Google will use this new technology.

Google

The bottom line is that these changes will undoubtedly have a significant impact on search optimisation. SERP competition will become more severe as more URLs compete for each ranking.

SEO has a long road ahead, and it will definitely not be all roses.

16th Oct – Google tells us whether a site section can affect the ranking score of the entire site

John Mueller shared insights on whether a site section can negatively impact the ranking score of the whole site, during an SEO office-hours session on YouTube.

In that Friday’s session, John was asked whether poorly performing elements in one section of a site can affect the overall ranking performance of the entire site. In answering, Mueller took the opportunity to share valuable insights into how Google applies ranking factors to an entire site, to a section of a site, or at a more granular level.

The exact question was whether a poor Core Web Vitals score in a specific section of an otherwise exceptional site will impact its rankings, given that Core Web Vitals will become a ranking factor in 2021.

https://www.youtube.com/watch?v=JV7egfF29pI&ab_channel=GoogleWebmasters

Regarding the question of how site sections can impact sitewide ranking, John Mueller said:

So in general, with our algorithms we try to be as fine grained as possible. So if we can get granular information for your site and recognize the individual parts of your website properly then we will try to do that.

John Mueller, Senior Webmaster Trends Analyst at Google

Mueller explained that Google takes a more granular approach when trying to rank web pages and sections of websites.

Mueller went on to say that when a site makes it easy for Google to distinguish the content of its different sections, Google can figure out that these sections are not related to each other and can therefore rank them more accurately.

So depending on how much data is available there and how easily it is for us to figure out which parts of your site are separate, and then that’s something that we can do easier or that is a little bit harder.

And you can do that with things like making sure you have a clean sub-directory structure on your site or using subdomains if that makes sense for your website.

In your case it would be for example to split out the forum from… site where we can tell oh, slash forum is everything forum and it’s kind of slow and everything else that’s not in slash forum is really fast; if we can recognize that fairly easily that’s a lot easier.

Then we can really say everything here in slash forum is kind of slow, everything here is kind of okay.

John Mueller, Senior Webmaster Trends Analyst at Google

Later on, replying to the relevant question, Mueller suggested that a flat site architecture would not be a good idea. A flat site architecture is one where all of a site’s pages equally share the PageRank benefit that links direct to the homepage.

On the other hand if we have to do this on a per URL basis where the URL structure is really like we can’t tell based on the URL if this is a part of the forum or part of the rest of your site, then we can’t really group that into parts of your website.

And then we’ll kind of be forced to take an aggregate score across your whole site and apply that appropriately.

I suspect we’ll have a little bit more information on this as we get closer to announcing or… closer to the date when we start using Core Web Vitals in search.

John Mueller, Senior Webmaster Trends Analyst at Google

John also suggested using Google Search Console to check how URLs affect the rest of the site. 

But it is something you can look at already a little bit in search console. There’s a Core Web Vitals report there and if you drill down to individual issues you’ll also see this URL affects so many similar URLs.

John Mueller, Senior Webmaster Trends Analyst at Google

What we can take away from Mueller’s answers: site architecture can help Google determine the nature of a site’s different parts; a flat site architecture can negatively impact rankings; and sites that make their different sections easy for Google to understand will be less exposed to sitewide Core Web Vitals effects.
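To make the grouping point concrete, here is a small illustrative Python sketch: when URLs share a clean prefix such as /forum/, per-URL metrics (the LCP values below are invented) can be aggregated into per-section scores; without that structure, only the sitewide aggregate is available.

```python
# Illustration of Mueller's point: with a clean sub-directory structure,
# per-URL metrics (invented LCP values, in seconds) can be grouped into
# per-section scores; otherwise only a sitewide aggregate is possible.
from collections import defaultdict
from statistics import mean

lcp_by_url = {
    "/forum/thread-1": 4.8,
    "/forum/thread-2": 5.1,
    "/guides/windows": 1.9,
    "/guides/doors": 2.1,
}

sections = defaultdict(list)
for url, lcp in lcp_by_url.items():
    section = "/" + url.strip("/").split("/")[0]  # first path segment
    sections[section].append(lcp)

for section, values in sorted(sections.items()):
    print(f"{section}: mean LCP {mean(values):.1f}s")  # /forum is the slow part

print(f"sitewide: mean LCP {mean(lcp_by_url.values()):.1f}s")
```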

22nd Oct – Google will highlight best product deals in search results

Google introduced updates that help retailers by highlighting their promotions in search results, giving their products valuable exposure: the best shopping deals will now be highlighted in search results.

This update is a win-win for both retailers and deal seekers. The latter get the best available deals, whilst retailers gain exposure for their best offers. The updates concern Google’s free shopping listings, which kicked off earlier this year, and are automatically enabled for all eligible US retailers.

Retailers can edit promotions via the Merchant Center, and approval takes no more than a few hours. The new shopping capability will better meet shoppers’ demands around things like flash sales or extending an existing promotional period.

Promotions will show up in various places across Google, including annotations similar to the ones in the following example, as highlighted by Search Engine Journal:

Google ads promotion and annotations

Google will showcase the best product offers in:

  • Ads in Google Shopping
  • Free listings in Google Shopping
  • Google Images (mobile only)
  • Local inventory ads in search results

As expected, the new features will be extra user-friendly, and shoppers won’t have to struggle to find the best deals, as the new “on sale” filter will be available in the Shopping tab for both mobile and desktop.

Shopping tab "on sale" feature on Google

Shoppers searching for the best deals will be able to see which retailer is offering an on-sale item via the product knowledge panel. Google made clear that the new features will be available to all US retailers, regardless of whether they advertise with Google.

Amongst other things, Google’s shopping listings announcement gave us some interesting statistics that anyone looking to sell on Google might find useful:

  • 58% of US holiday shoppers will hold off on buying gift items until they’re on sale
  • 40% of global shoppers say they turn to Google to find the best deal
  • 58% of US holiday shoppers bought from at least one brand last holiday season that they hadn’t before
  • 65% of US holiday shoppers are starting to shop earlier to avoid out-of-stock items