Google researchers have created an artificial intelligence system that measures “beauty and emotion” in images

NIMA judges photos on a 1-10 scale to determine how pleasing they would be to the human eye

The model could help filter similar photos and improve post-processing techniques

Beauty is in the eye of the beholder, or so the saying goes, and the same is often true when trying to pick out a perfect photograph. Say you’ve got ten relatively similar shots of a loved one, family pet, or a stunning landscape – which one is the perfect shot and, crucially, why?

It’s a tough question to answer, as there are multiple factors at play. It could be the most technically competent shot, with no sign of any pesky blur or noise, but, on the other hand, it could be the shot that catches the light in a way that makes it more appealing than the rest, even if it isn’t technically the best of the bunch.

Even if we’re not aware of it, the human brain tends to strike a balance between technical quality and aesthetic preference when judging photos. This means that even amateur photographers can pick out their favorite shot from a similar batch.


But what if artificial intelligence could select the ‘best photo’ for us? A team of Google researchers has attempted to do just that with an AI model dubbed Neural Image Assessment (NIMA).

By now we’re all familiar with AI features baked into current smartphone camera suites that identify objects within each photo. NIMA goes one step further, using deep learning techniques to train a convolutional neural network (CNN) that can rate an image not just on its technical quality, but also on how likely its overall aesthetic is to appeal to the human eye.

Rather than categorizing an image as simply high or low technical quality, NIMA uses a scoring system to rate the aesthetics of a photo on a scale of 1 to 10. Using this method, NIMA can examine each individual pixel for a technical assessment while also taking into account “semantic level characteristics associated with emotions and beauty in images.”
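The paper describes the model as predicting a distribution of ratings over the 1–10 scale rather than a single number, and the final score is the mean of that distribution. As a rough sketch of that last step (the function name and the example probabilities are hypothetical, not from Google's code), collapsing a predicted distribution into a NIMA-style score might look like this:

```python
import numpy as np

def mean_score(probs):
    """Collapse a predicted distribution over scores 1-10 into one rating.

    `probs` is a length-10 array of probabilities (e.g. the softmax output
    of a CNN head); the mean of the distribution is the aesthetic score.
    """
    scores = np.arange(1, 11)          # the ten possible ratings, 1..10
    probs = np.asarray(probs, dtype=float)
    return float(np.sum(scores * probs))

# Example: a distribution peaked around 7 yields a score near 7.
probs = [0, 0, 0, 0.05, 0.10, 0.20, 0.40, 0.15, 0.07, 0.03]
print(round(mean_score(probs), 2))  # prints 6.83
```

Working with a full distribution rather than a single predicted number also lets the model express uncertainty: two photos can share the same mean score while one has a much wider spread of plausible ratings.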

Amazingly, the system works too. In a paper outlining the project, Google’s researchers note that NIMA’s ratings closely matched the average ratings given by roughly 200 human raters for each image.

As for the AI’s practical applications, it’s not hard to imagine a feature on a phone – perhaps in a future update for the Google Pixel 2 – which selects the best photo without the user having to trawl through endless near-duplicates. The researchers also suggest that NIMA could “enable improved picture-taking with real-time feedback to the user,” and even help post-processing techniques produce “perceptually superior results”.

What do you think of Google’s new system? Would you trust an AI to pick the right photo for you? Let us know in the comments.