Google BERT Update: What It Is and How It Affects Search
Google BERT is a major update to Google’s search algorithm that rolled out in late 2019. BERT stands for “Bidirectional Encoder Representations from Transformers”, and the update is designed to help Google better understand the natural language used in search queries.
Prior to BERT, Google leaned more heavily on keyword matching to interpret search queries, which could sometimes produce inaccurate or irrelevant results. With BERT, Google can better grasp the context and nuances of natural-language queries, which should yield more accurate and relevant search results.
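To see why plain keyword matching falls short, consider a toy bag-of-words scorer (a deliberately simplified illustration, not Google’s actual ranking code). It counts shared words and ignores word order and function words, so it cannot tell who is travelling where:

```python
def keyword_score(query: str, doc: str) -> float:
    """Toy bag-of-words overlap: the fraction of query words found in the doc.
    Word order and grammar are ignored entirely."""
    q = set(query.lower().split())
    d = set(doc.lower().split())
    return len(q & d) / len(q)

# These two queries mean opposite things, yet they produce the same
# word set, so the scorer cannot distinguish them against any document:
doc = "visa requirements for brazilian citizens visiting the usa"
print(keyword_score("brazil traveler to usa", doc))  # 0.25
print(keyword_score("usa traveler to brazil", doc))  # 0.25, identical
```

Because the direction of travel lives in word order and in the small word “to”, a bag-of-words approach throws that information away before ranking even begins.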
BERT is based on neural network technology, specifically the Transformer architecture, which Google researchers first introduced in the 2017 paper “Attention Is All You Need”. Transformers are designed to handle sequential data, such as the words in a sentence, and are particularly effective at modelling the relationships between those words.
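The core mechanism inside a Transformer is attention: each word’s representation is recomputed as a weighted mix of the other words, with the weights reflecting how relevant each word is to the one being processed. The following is a minimal sketch of scaled dot-product attention in plain Python, using tiny hand-picked vectors for illustration; real models operate on learned, high-dimensional vectors:

```python
import math

def softmax(xs):
    # Turn raw scores into positive weights that sum to 1.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(query, keys, values):
    # Scaled dot-product attention: score the query against every key,
    # then return the score-weighted average of the values.
    d = len(query)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    weights = softmax(scores)
    return [sum(w * v[i] for w, v in zip(weights, values))
            for i in range(len(values[0]))]

# A query that matches the first key pulls the output toward the first value:
out = attention([1.0, 0.0],
                [[1.0, 0.0], [0.0, 1.0]],   # keys
                [[1.0, 0.0], [0.0, 1.0]])   # values
print(out)  # first component larger than the second
```

The weighting is what lets the model capture relationships between words regardless of how far apart they sit in the sentence.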
BERT is a “bidirectional” model: rather than reading a sentence strictly left to right, it conditions on the words both before and after each position at once, giving it a fuller picture of each word’s context. Previous language models typically processed text in a single direction.
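The difference can be sketched with a toy example (again simplified; real models work on learned vector representations, not word lists). A left-to-right model interpreting a word sees only what came before it, while a BERT-style model sees both sides:

```python
def left_to_right_context(tokens, i):
    # A unidirectional model sees only the words before position i.
    return tokens[:i]

def bidirectional_context(tokens, i):
    # A BERT-style model sees the words on both sides of position i.
    return tokens[:i] + tokens[i + 1:]

tokens = ["she", "sat", "by", "the", "bank", "of", "the", "river"]
i = tokens.index("bank")  # an ambiguous word: riverbank or financial bank?

print(left_to_right_context(tokens, i))  # ['she', 'sat', 'by', 'the']
print(bidirectional_context(tokens, i))  # also includes 'of', 'the', 'river',
                                         # enough to disambiguate "bank"
```

Only the bidirectional view includes “of the river”, which is exactly the context needed to resolve the ambiguity.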
So, what does all of this mean for search? In short, BERT should help Google better understand natural language queries and deliver more accurate and relevant search results. This is especially true for longer, more complex search queries that may contain multiple phrases and ideas.
For example, consider the search query “2019 Brazil traveler to USA needs a visa”. Before BERT, Google effectively ignored the small word “to” and could surface results about U.S. citizens travelling to Brazil. With BERT, Google recognises that “to” sets the direction of travel: the query is about a Brazilian traveler visiting the USA who needs a visa.
It’s worth noting that BERT is just one of many factors that Google uses to rank search results. While it should improve the accuracy and relevance of search results for certain types of queries, it won’t necessarily guarantee a higher ranking for any given webpage.
Overall, BERT represents a significant step forward in Google’s ability to understand natural language queries and deliver more accurate and relevant search results.