Today at Google I/O 2019, Google announced several new features for Google Lens, its image recognition software, which is available as a standalone app and built into the camera on Google Pixel devices.

The biggest new addition is better support for Lens in restaurants. Point your phone's camera at a menu, and Lens will automatically highlight the restaurant's popular dishes; tapping an individual dish shows you photos and reviews from Google Maps. Then, when it comes time to pay, you can point your camera at the bill, and Lens will bring up an interface to help you calculate a tip and split the total.
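The tip-and-split feature boils down to simple arithmetic. Here's an illustrative sketch of that calculation (the function name and structure are my own, not Google's actual implementation):

```python
def split_bill(total: float, tip_percent: float, people: int) -> float:
    """Return each person's share of a bill after adding a tip.

    This is a hypothetical example of the math Lens performs,
    not Google's actual code.
    """
    tipped_total = total * (1 + tip_percent / 100)
    return round(tipped_total / people, 2)

# An $84.00 bill with an 18% tip, split four ways:
print(split_bill(84.00, 18, 4))  # → 24.78
```

Lens does this automatically from the scanned total, so the user never types the numbers in by hand.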

"Today's special: Google Lens. Automatically highlighting what's popular on a menu, when you tap on a dish you can see what it looks like and what people are saying about it, thanks to photos and reviews from @googlemaps. #io19 pic.twitter.com/5PcDsj1VuQ" — Google (@Google), May 7, 2019

If you’re more interested in cooking at home than eating out, Google Lens can also help with recipes in Bon Appétit magazine, overlaying instructional videos that show how to prepare a dish.

Google Lens is also now capable of reading text aloud, in addition to capturing and translating it. A short video showed how this functionality could help people who can’t read understand signs and computer interfaces. The feature is coming first to Google Go, Google's lightweight search app for entry-level devices, where it will add just over 100KB to the app's size.

"Launching first in Google Go, our Search app for first-time smartphone users, we're working to help people who struggle with reading by giving Google Lens the ability to read text out loud. #io19 pic.twitter.com/YMVbZa5XMQ" — Google (@Google), May 7, 2019

Google says the new Lens features should start rolling out later this month.