Google now using BERT models to improve quality of search results


Google today announced that it will be improving search results using the BERT technique. Bidirectional Encoder Representations from Transformers (BERT) is a neural network-based technique for natural language processing (NLP) pre-training. Instead of processing each word in a sentence one by one, BERT considers the full context of a word by looking at the words that come before and after it. Google provided the following example of how BERT will improve search results.
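To get a feel for this bidirectional behavior, here is a minimal sketch (not Google's production Search system) that queries a publicly available pre-trained BERT model through the Hugging Face transformers library. The masked-word prediction it performs is the task BERT is pre-trained on, and the candidates it suggests depend on the words on both sides of the mask; the example sentence is illustrative only.

```python
# Sketch: probing a pre-trained BERT model's use of surrounding context.
# Requires: pip install transformers torch
from transformers import pipeline

# Fill-mask pipeline backed by the original BERT base model.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# BERT reads the words before AND after [MASK] to rank candidate tokens,
# rather than scanning the sentence in a single direction.
for prediction in fill_mask("A Brazilian traveler to the USA needs a [MASK]."):
    print(f"{prediction['token_str']:>12}  score={prediction['score']:.3f}")
```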

Here’s a search for “2019 brazil traveler to usa need a visa.” The word “to” and its relationship to the other words in the query are particularly important to understanding the meaning. It’s about a Brazilian traveling to the U.S., and not the other way around. Previously, our algorithms wouldn’t understand the importance of this connection, and we returned results about U.S. citizens traveling to Brazil.

Take a look at the image at the start of the article. Google can now understand “stand” as a physical activity and provide appropriate results. Google mentioned that BERT will help Search better understand one in 10 searches in the U.S. in English. Google will also bring BERT tech to more languages and regions in the coming months.

Source: Google

