People have already asked Google Lens more than a billion questions about things they see. Lens taps into machine learning (ML), computer vision and tens of billions of facts in the Knowledge Graph to answer these questions. Now, Google is evolving Lens to provide more visual answers to visual questions.
Say you’re at a restaurant, figuring out what to order. Lens can automatically highlight which dishes are popular, right on the physical menu.
When you tap on a dish, you can see what it actually looks like and what people are saying about it, thanks to photos and reviews from Google Maps.
Google Lens helps you decide what to order
To pull this off, Lens first has to identify all the dishes on the menu, looking for things like the font, style, size and color to differentiate dishes from descriptions. Next, it matches the dish names with the relevant photos and reviews for that restaurant in Google Maps.
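The two steps described above can be sketched in code. This is a toy illustration, not Lens’s actual pipeline: the `TextRegion` fields, the size/weight heuristic, and the photo lookup are all illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class TextRegion:
    """One piece of OCR'd menu text with simple typographic features."""
    text: str
    font_size: float  # estimated point size from OCR
    bold: bool

def find_dish_names(regions, size_threshold=12.0):
    """Heuristic: dish names tend to be larger or bolder than descriptions."""
    return [r.text for r in regions if r.font_size >= size_threshold or r.bold]

def match_to_place_data(dishes, place_photos):
    """Match detected dish names against photo data for this restaurant."""
    return {d: place_photos.get(d.lower(), []) for d in dishes}

menu = [
    TextRegion("Margherita Pizza", 14.0, True),
    TextRegion("Tomato, mozzarella, fresh basil", 9.0, False),
    TextRegion("Carbonara", 14.0, True),
    TextRegion("Guanciale, egg, pecorino", 9.0, False),
]
photos = {"margherita pizza": ["photo_456.jpg"], "carbonara": ["photo_123.jpg"]}

dishes = find_dish_names(menu)
print(dishes)  # ['Margherita Pizza', 'Carbonara']
print(match_to_place_data(dishes, photos))
```

In practice the hard part is the first step: deciding from layout cues alone which text regions are dish names, which the toy threshold only gestures at.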
Google Lens translates the text and puts it right on top of the original words
Lens can be particularly helpful when you’re in an unfamiliar place and you don’t know the language. Now, you can point your camera at text and Lens will automatically detect the language and overlay the translation right on top of the original words, in more than 100 languages.
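The detect-then-overlay flow can be sketched as follows. `detect_language` and `translate` here are stand-in stubs for real models, and the box format is a made-up example; the point is that translation replaces each detected string while keeping its geometry, so the result can be rendered on top of the original words.

```python
FRENCH_WORDS = {"sortie", "entrée"}  # stub vocabulary for illustration

def detect_language(text):
    # Stub: a real system would run a trained language-ID model.
    return "fr" if text.lower() in FRENCH_WORDS else "en"

TRANSLATIONS = {("fr", "en"): {"sortie": "exit"}}  # illustrative lookup

def translate(text, src, dst="en"):
    return TRANSLATIONS.get((src, dst), {}).get(text.lower(), text)

def overlay(ocr_boxes):
    """Swap each box's text for its translation, preserving the box
    geometry so the translation draws over the original words."""
    out = []
    for box in ocr_boxes:
        src = detect_language(box["text"])
        out.append({**box, "text": translate(box["text"], src)})
    return out

boxes = [{"text": "sortie", "x": 10, "y": 20, "w": 80, "h": 24}]
print(overlay(boxes))  # [{'text': 'exit', 'x': 10, 'y': 20, 'w': 80, 'h': 24}]
```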
Google is also working on other ways to connect helpful digital information to things in the physical world. For example, beginning next month at the de Young Museum in San Francisco, you can use Lens to see hidden stories about the paintings, directly from the museum’s curators. Or if you see a dish you’d like to cook in an upcoming issue of Bon Appétit magazine, you’ll be able to point your camera at a recipe and have the page come to life and show you exactly how to make it.
See a recipe in Bon Appétit come to life with Google Lens
Bringing Lens to Google Go
More than 800 million adults worldwide struggle to read things like bus schedules or bank forms. So Google asked itself: “What if we used the camera to help people who struggle with reading?”
When you point your camera at text, Lens can now read it out loud to you. It highlights the words as they are spoken, so you can follow along and understand the full context of what you see. You can also tap on a specific word to search for it and learn its definition. This feature is launching first in Google Go, Google’s Search app for first-time smartphone users. Lens in Google Go is just over 100KB and works on phones that cost less than $50.
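The follow-along highlighting can be sketched with one small function, assuming the speech engine reports a start time for each word (the timestamps and names below are hypothetical):

```python
def word_at(timestamps, t):
    """Return the index of the word being spoken at time t (seconds),
    given each word's start time in ascending order."""
    current = 0
    for i, start in enumerate(timestamps):
        if start <= t:
            current = i
        else:
            break
    return current

words = ["The", "bus", "departs", "at", "nine"]
starts = [0.0, 0.4, 0.7, 1.3, 1.5]  # hypothetical per-word start times

print(words[word_at(starts, 0.8)])  # departs
```

The UI would call this on every frame and highlight `words[word_at(starts, now)]`, which is what makes the spoken word light up in sync.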
All these features in Google Search and Google Lens provide visual information that helps you explore the world and get things done throughout your day, putting information and answers where they are most helpful: right on the world in front of you.