Google now using BERT models to improve quality of search results
Google today announced that it will improve search results using a technique called BERT. Bidirectional Encoder Representations from Transformers (BERT) is a neural network-based technique for natural language processing (NLP) pre-training. Rather than processing each word in a sentence in isolation, BERT considers the full context of a word by looking at the words that come before and after it. Google provided the following example of how BERT will improve search results.
Here’s a search for “2019 brazil traveler to usa need a visa.” The word “to” and its relationship to the other words in the query are particularly important to understanding the meaning. It’s about a Brazilian traveling to the U.S., and not the other way around. Previously, our algorithms wouldn’t understand the importance of this connection, and we returned results about U.S. citizens traveling to Brazil.
In another of Google's examples, Search can now understand "stand" as referring to a physical activity and provide appropriate results. Google said BERT will help Search better understand one in 10 searches in the U.S. in English, and that it will bring BERT to more languages and regions in the coming months.
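To get a feel for what "considering the full context of a word" means in practice, here is a minimal sketch (not Google's production setup) that pulls contextual embeddings from a public BERT checkpoint via the Hugging Face transformers library. The two example queries are illustrative assumptions; the point is that the same word receives a different vector depending on the words around it.

```python
# Minimal sketch: contextual embeddings from BERT (assumes the Hugging Face
# "transformers" library and the public "bert-base-uncased" checkpoint).
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

def embedding_for(sentence: str, word: str) -> torch.Tensor:
    """Return the contextual embedding of `word` within `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    idx = tokens.index(word)  # position of the word in the tokenized input
    return outputs.last_hidden_state[0, idx]

# The same word "stand" gets different vectors depending on its context.
a = embedding_for("do estheticians stand a lot at work", "stand")
b = embedding_for("where does the fruit stand open today", "stand")
similarity = torch.nn.functional.cosine_similarity(a, b, dim=0)
print(f"cosine similarity between the two 'stand' embeddings: {similarity:.3f}")
```

Because the embeddings are produced by attending to the surrounding words in both directions, the two occurrences of "stand" are not identical, which is what lets a downstream system distinguish a physical activity from a market stall.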
Source: Google