One of the more impressive announcements Google made at I/O 2018 was “Lookout.” The app helps visually impaired users understand their surroundings by letting them point their phone at objects and receive spoken feedback. Lookout for Android is now available for Google Pixel phones.

Since I/O, Google has been testing and improving the quality of the app’s recognition results, which include people, text, and objects. It leverages the same underlying AI technology as Google Lens to recognize what the camera sees and then provide spoken feedback, earcons, or other continuous signals to the user. Like Lens, it works by pointing your phone’s rear camera at the world.

Users can select modes like “Explore” for new spaces, “Shopping” for barcodes and currencies, or “Quick read” to hear signs and labels. There is also a Camera view that provides live recognition.

As Google explains: “We designed Lookout to work in situations where people might typically have to ask for help—like learning about a new space for the first time, reading text and documents, and completing daily routines such as cooking, cleaning and shopping.”

Google recommends hanging your phone from a lanyard around your neck or placing it in a front shirt pocket to provide a live view of the world.

Once you’ve opened the Lookout app, all you have to do is keep your phone pointed forward. You won’t have to tap through any further buttons within the app, so you can focus on what you’re doing in the moment.

Lookout is available in the Play Store for U.S. Pixel, Pixel 2, or Pixel 3 phones running Android 8.0 Oreo and above. Google notes that it “will not always be 100 percent perfect,” and is looking for feedback. Over time, it will be available for more devices, countries, and platforms.

Google adds: “We’re very interested in hearing your feedback and learning about times when Lookout works well (and not so well) as we continue to improve the app.” Users can send feedback by contacting the Disability Support team at g.co/disabilitysupport.
