Microsoft on Wednesday released Seeing AI, an iPhone app that attempts to analyze its surroundings and describe them audibly for people with impaired vision.

Using neural network technology, the app can not only read text aloud but also recognize people and currency, scan product barcodes, and offer a simplified description of an entire scene or imported image. For features like barcode and text recognition, audio cues guide users toward getting a solid lock. Some basic functions work without an internet connection.

When analyzing people, the app will not only try to name them if possible, but also share details like their estimated age, how far away they are, and their emotional state.

Microsoft has been working on the app since September 2016, and first demonstrated a prototype this past March.