Google now using BERT models to improve quality of search results

Google today announced that it will be improving search results using a technique called BERT. Bidirectional Encoder Representations from Transformers (BERT) is a neural network-based technique for natural language processing (NLP) pre-training. Instead of considering each word in a sentence in isolation, BERT considers the full context of a word by looking at the words that come before and after it. Google provided the following example of how BERT will improve search results.
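BERT itself is open source, so the effect of this bidirectional context can be sketched with the publicly available bert-base-uncased checkpoint and the Hugging Face transformers library (both are assumptions for illustration, not part of Google's Search stack): the model is trained to predict a masked word from the words on both sides of it.

```python
# A minimal sketch, assuming the Hugging Face "transformers" package and the
# public "bert-base-uncased" checkpoint; this is not Google's production
# Search system, only an illustration of bidirectional context.
from transformers import pipeline

# BERT is trained as a masked language model: it predicts a hidden word
# from the words on BOTH sides of the blank, not just the words before it.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# The same masked slot gets different predictions depending on the words
# that come before and after it.
for sentence in [
    "a brazilian traveler to the [MASK] needs a visa.",
    "a [MASK] traveler to brazil needs a visa.",
]:
    top = fill_mask(sentence)[0]  # highest-scoring prediction
    print(f"{sentence} -> {top['token_str']} (score {top['score']:.2f})")
```

Because the prediction for each blank depends on the surrounding words in both directions, swapping the position of the blank relative to "to" and "brazil" changes what the model considers likely, which is the kind of directional relationship the example below hinges on.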

Here’s a search for “2019 brazil traveler to usa need a visa.” The word “to” and its relationship to the other words in the query are particularly important to understanding the meaning. It’s about a Brazilian traveling to the U.S., and not the other way around. Previously, our algorithms wouldn’t understand the importance of this connection, and we returned results about U.S. citizens traveling to Brazil.

Take a look at the image at the start of the article: Google can now understand "stand" as referring to a physical activity and return appropriate results. Google says BERT will help Search better understand one in 10 searches in the U.S. in English, and the company will bring the technique to more languages and regions in the coming months.

Source: Google
