Google has announced that it is upgrading image search results on mobile in the US with more information drawn from its Knowledge Graph.
Information such as people, places or things related to the image, drawn from the Knowledge Graph's database of billions of facts, will now be displayed below the image, allowing users to learn more about the picture and the context surrounding it.
For example, say you’re searching for beautiful state parks to visit nearby. You want to swim during your visit, so you tap on a picture of a park with a river. Beneath the photo, you might see related topics, such as the name of the river, or which city the park is in. If you tap a specific topic, it will expand and show you a short description of the person, place or thing it references, along with a link to learn more and other related topics for you to explore.
To generate the links to relevant Knowledge Graph entities, Google takes what it understands about the image through deep learning, which evaluates an image’s visual and text signals, and combines it with its understanding of the text on the image’s web page. This information helps it determine the most likely people, places or things relevant to a specific image. It matches these with existing topics in the Knowledge Graph and then surfaces them in Google Images when it is confident it has found a match.
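The matching step described above can be sketched in rough pseudocode terms. This is purely an illustrative assumption of how such a pipeline might combine visual labels and page text into a confidence-gated topic match; none of the names, scoring, or thresholds below come from Google.

```python
def match_topics(visual_labels, page_terms, kg_topics, threshold=0.8):
    """Hypothetical sketch: surface a Knowledge Graph topic only when the
    combined evidence (image labels + page text) covers enough of the
    topic's known aliases to clear a confidence threshold.

    visual_labels: labels inferred from the image's visual content
    page_terms:    terms extracted from the hosting web page's text
    kg_topics:     mapping of topic name -> set of alias strings
    """
    evidence = set(visual_labels) | set(page_terms)
    matches = []
    for topic, aliases in kg_topics.items():
        # Fraction of the topic's aliases supported by any evidence source.
        score = len(aliases & evidence) / len(aliases)
        if score >= threshold:  # only confident matches are surfaced
            matches.append((topic, score))
    return sorted(matches, key=lambda m: -m[1])
```

With illustrative data such as an image labeled "river" and "water" on a page mentioning "merced river", a topic whose aliases are fully covered would be surfaced, while a weakly supported topic would be suppressed by the threshold.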
This feature will start to appear on some images of people, places and things in Google Images when searched on mobile in the US, and will expand to more images, languages and surfaces over time.