Google Lens knows more about what’s in your photos than you do
At Google I/O today, the company announced Google Lens, a new feature coming in updates to Google Assistant and Google Photos. Lens will tell you more about what’s in front of your device’s camera, offering contextual information and actionable options depending on what you’re looking at.

Google CEO Sundar Pichai revealed Google Lens early in the conference’s keynote, and it sounds much like the technology behind the original Google Glass. You can point your Android device’s camera at something, be it a flower, a restaurant, or a Wi-Fi network name and password, and Google Lens will give you more information about what you’re looking at.
In the demo, Google showed an image of a flower that Google Lens was able to identify immediately.
Later in the keynote, another example showed Google Assistant using Lens’ technology to translate Japanese writing into English simply by pointing the camera at a Japanese sign.

Google Assistant will also provide contextual information using Lens: point your camera at a movie theater marquee showing the film you want to see, and Assistant uses Lens to identify the movie and offer real-world actions, including buying tickets, adding the showtime to your calendar, and more.
