This means that even if you can't remember when a photo was taken, you might still be able to find it easily if you remember its content. Facebook gives the example of searching for "black shirt photo" and the system being able to "see" all of your photos featuring a black shirt, even if they aren't tagged as such. It also means you'll be able to sort through photos your friends have shared to find what you're looking for, rather than having to rely on tags and text descriptions.

Of course, for this to work, Lumos has to be able to recognize objects when they appear in photos. Facebook's developers used deep learning to train a neural network to identify objects, feeding it tens of millions of properly annotated photos. As a result, the image search can pick up on scenes, objects, animals, places, attractions and clothing items. The system also factors in some degree of diversity, so the search results aren't just a string of near-identical images. Not only is this handy for search, it will also help Facebook describe images and video to users who are visually impaired.
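To make the idea concrete, here's a minimal sketch of how concept-based photo search with a diversity pass might work. This is purely illustrative: the concept labels are hard-coded stand-ins for what a trained neural network would predict per photo, and the filenames, function names, and the simple "drop near-duplicate concept sets" heuristic are all assumptions, not Facebook's actual implementation.

```python
# Illustrative sketch: searching photos by predicted concept labels,
# then filtering for diversity. In a real system, a neural network
# would generate these labels; here they are stubbed by hand.

photo_concepts = {
    "img_001.jpg": {"black shirt", "person", "indoors"},
    "img_002.jpg": {"black shirt", "person", "beach"},
    "img_003.jpg": {"dog", "park"},
    "img_004.jpg": {"black shirt", "person", "indoors"},
}

def search(query_terms, concepts):
    """Return photos whose predicted concepts include every query term."""
    terms = set(query_terms)
    return [photo for photo, tags in concepts.items() if terms <= tags]

def diversify(results, concepts, per_scene_limit=1):
    """Keep at most `per_scene_limit` photos per identical concept set,
    so results aren't a run of near-duplicate scenes."""
    counts = {}
    kept = []
    for photo in results:
        key = frozenset(concepts[photo])
        counts[key] = counts.get(key, 0) + 1
        if counts[key] <= per_scene_limit:
            kept.append(photo)
    return kept

hits = search({"black shirt"}, photo_concepts)       # three matching photos
diverse_hits = diversify(hits, photo_concepts)       # duplicates of a scene dropped
```

The diversity step is the interesting part: two photos with identical predicted concepts are treated as the same scene, so only one is surfaced, mirroring the article's point that results shouldn't be a bunch of similar images.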