Google will soon be extending the reach of Google Lens, its visual search interface. In a blog post, the company announced Lens would be integrated into the Google Assistant in the coming weeks. The feature is still exclusive to Pixel phones, but now it should be a lot easier to access.

Google Lens came out in beta on the Google Pixel 2, which launched last month. The service is basically a revamp of Google Goggles: you take a picture of something, run it through Google's computer vision algorithms, and Google tries to tell you what's in the picture. Google says Lens can identify text, landmarks, and media covers, but those are all things Goggles could do years ago. We tried Lens on the Pixel 2 at launch, and while it was definitely a beta with a lot of problems, it occasionally did something impressive, like recognizing not just that a picture contained a dog, but also nailing the dog breed.

Google says Assistant integration will allow you to get "quick help with what you see." This sounds like a big improvement over the current beta of Google Lens, which is only integrated into Google Photos. Running recognition through the Photos app is really slow: you have to open the camera app, aim it at something, take a picture, open that picture, and then run it through Lens. The new entry point will be a lot quicker to reach; you just open the Assistant and tap the Lens icon in the bottom-right corner.

Google says the Lens-in-Assistant integration will be coming to "Pixel phones set to English in the US, UK, Australia, Canada, India and Singapore over the coming weeks."