Google Assistant’s smarts are coming to the camera. Today at I/O 2017, CEO Sundar Pichai announced that Assistant will soon be able to analyze what a smartphone camera is pointed at and provide relevant content.

Point it at a flower, and Google will identify the species. Point it at a concert poster, and Assistant will suggest buying tickets, watching a video on YouTube, or searching for the artist. Point it at a restaurant, and you'll instantly see reviews. You get the idea.

"With Google Lens, your smartphone camera won't just see what you see, but will also understand what you see to help you take action. #io17" — Google (@Google) May 17, 2017

My favorite example Google showed involved Wi-Fi: point your camera at a network's login credentials, and your Android device will automatically sign in with them.

This capability isn't altogether new for Google; the company's Google Translate app uses similar technology to automatically translate words that appear in the camera's viewfinder. Behind the scenes, a technology Google calls Google Lens powers Assistant's new feature. Google Lens is also coming to the Google Photos app.

Samsung has recently attempted to offer similar functionality with its Bixby assistant. For now, though, Bixby's usefulness is mostly limited to finding similar images or shopping for items on Amazon.

There's much more Google news out of I/O to come — keep an eye on our live blog!