When we introduced the new Pixel 2 last month, we talked about how Google Lens builds on Google’s advancements in computer vision and machine learning. When you combine that with the Google Assistant, which is built on many of the same technologies, you can get quick help with what you see. That means that you can learn more about what’s in front of you—in real time—by selecting the Google Lens icon and tapping on what you’re interested in.

Here are the key ways your Assistant and Google Lens can help you today:

Text: Save information from business cards, follow URLs, call phone numbers and navigate to addresses.

Landmarks: Explore a new city like a pro with your Assistant to help you recognize landmarks and learn about their history.

Art, books and movies: Learn more about a movie, from the trailer to reviews, right from the poster. Look up a book to see the rating and a short synopsis. Become a museum guru by quickly looking up an artist’s info and more. You can even add events, like the movie release date or gallery opening, to your calendar right from Google Lens.

Barcodes: Quickly look up products by barcode, or scan QR codes, all with your Assistant.

Google Lens in the Assistant will be rolling out to all Pixel phones set to English in the U.S., U.K., Australia, Canada, India and Singapore over the coming weeks. Once you get the update, go to your Google Assistant on your phone and tap the Google Lens icon in the bottom right corner.