The application has four modes to choose from: Home, Work & Play, Scan or Experimental. They tell the app which elements to focus on -- Home, for instance, can tell users where the furniture is if they're doing chores or cleaning the house and need to move things away from their usual places. Work & Play can tell them where the office elevator is, or where the various tools they need for their job are, like in the video above, wherein the app tells the user where her scissors are. Users can also try out beta features through Experimental mode. The app will use machine learning to figure out what visually impaired users deem important and worth hearing about -- so the more people use it, the better it becomes.

In its announcement, Google recommended wearing the Pixel phone with the app installed on a lanyard around the neck (or keeping it in a shirt pocket) with the camera facing the world. From the sound of things, it's going to be a Pixel exclusive at launch, just like Google's AI-powered image recognition tool Lens was. The company eventually made Lens available on iOS and other Android phones, though, so there's always a chance Lookout will make its way to other devices. Before any of that happens, we first have to wait for the app to arrive on the Play Store. Google didn't mention a specific date and only said that it'll be available sometime this year.
