
Meet BERT – Google’s latest search algorithm

The BERT algorithm – which stands for Bidirectional Encoder Representations from Transformers, to give it its full glorious name – is search giant Google’s latest significant search update. It’s set to affect around one in 10 searches, and its ultimate aim is to help Google better understand the intent behind search queries.

For the user, the idea is that results become more tightly tailored – the Holy Grail of relevance that Google is always seeking. Rollout began near the end of October 2019, but Google open-sourced the underlying technique nearly a year earlier, in November 2018, so anyone can use it to train their own language-processing system.

Google has described BERT in somewhat dramatic terms as ‘one of the biggest leaps forward in the history of search.’ We couldn’t possibly comment! But it is likely to affect the organic visibility of your brand in some way, even if you don’t actually notice it.

At the moment, Google is using BERT to further its understanding of roughly a tenth of all searches made in English in the States.

So what is it?

Put technically, BERT is a neural-network-based technique for natural language processing (NLP) pre-training. Expressed more simply, it’s an algorithm Google can use to understand the context of words in search terms more clearly.

For example, the word ‘to’ could have different meanings in different expressions, which may be obvious to us human beings, but not to search engines. BERT aims to find more relevant search results by distinguishing between such linguistic subtleties.
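To make the pre-training idea concrete, here’s a minimal sketch of the masked-word task BERT learns from. It uses the open-sourced model via the third-party Hugging Face transformers library; that library and the bert-base-uncased checkpoint are our own choices for illustration, not anything Google’s announcement specifies.

```python
# Minimal sketch of BERT's masked-word pre-training objective.
# Assumes the third-party Hugging Face "transformers" library and the
# public "bert-base-uncased" checkpoint (pip install transformers torch).
from transformers import pipeline

# During pre-training, BERT hides words and learns to predict them from
# the context on BOTH sides of the gap (the "bidirectional" in its name).
fill = pipeline("fill-mask", model="bert-base-uncased")

# Ask the pre-trained model to fill in the blank.
for prediction in fill("She opened a [MASK] account to deposit her savings."):
    print(f"{prediction['token_str']:>12}  score={prediction['score']:.3f}")
```

The highest-scoring fillers (typically ‘bank’ or ‘savings’) come straight out of pre-training, before the model is ever pointed at search queries.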

How does it work?

BERT’s key selling point is that it trains language models on whole groups of words in a sentence at once, so that a word’s context is learned from all of its surrounding words, not just the one immediately before or after it.

That means Google appreciates the difference between, for example, ‘bank account’ and ‘bank of the river’.
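To see that in practice, here’s a hedged sketch (again assuming the Hugging Face transformers library, which the post itself doesn’t mention) that pulls out the vector BERT assigns to the word ‘bank’ in each of those two phrases. A traditional static word embedding would give ‘bank’ the same vector in both sentences; BERT’s vectors differ because it reads the whole sentence.

```python
# Sketch: the same word gets different BERT vectors in different contexts.
# Assumes the Hugging Face "transformers" library and PyTorch
# (pip install transformers torch); both are illustrative choices only.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def bank_vector(sentence: str) -> torch.Tensor:
    """Return BERT's contextual vector for the token 'bank' in `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]  # (num_tokens, 768)
    bank_id = tokenizer.convert_tokens_to_ids("bank")
    position = (inputs["input_ids"][0] == bank_id).nonzero()[0].item()
    return hidden[position]

v_money = bank_vector("I opened a bank account.")
v_river = bank_vector("We walked along the bank of the river.")

# Cosine similarity well below 1.0: one word, two meanings, two vectors.
print(torch.cosine_similarity(v_money, v_river, dim=0).item())
```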

In a statement, the company said:

“Particularly for longer, more conversational queries, or searches where prepositions like ‘for’ and ‘to’ matter a lot to the meaning, search will be able to understand the context of the words in your query.”

BERT may sound similar to RankBrain, Google’s earlier AI technique for understanding queries. But it’s actually a distinct algorithm, so the two should not be confused. Equally, BERT isn’t there to replace RankBrain, nor will it operate in isolation from other tools.

While BERT will mainly affect search, there’s also likely to be some impact on Google Assistant. Equally, if it is integrated into Google Ads in the future, it could actually assist advertisers by helping to resolve ambiguity in queries.

Do I need to do anything about BERT?

In short, no. Google continues to advise website owners to create content which gives the user what they want and satisfies their intent. If you’d like to know more about BERT or content creation, talk to us at Front Page today.