Google's Deep Dream software proved that computer imagination can be strange and hallucinogenic. But, given the right parameters, it can also be profoundly dirty. Just look at the AI-generated pictures above — the top row of images all look fairly innocent (they're supposed to be towers); but the bottom row, well, has an unmistakable penis-y feel to it. That's right: artificial intelligence has learned how to hallucinate genitals.

This imagery is the work of computer scientist Gabriel Goh, who created a neural network that mashes together two existing programs. The first is a Deep Dream-like image generator from MIT that uses deep learning to look at libraries of pictures and create similar images, and the second is an open-source program from Yahoo that automatically detects and filters pornography.

Goh essentially plugged these two neural nets together, creating a program that can generate random images with an adjustable amount of NSFW-ness. The neural net scores images between 0 and 1, with 0 being completely safe-for-work and 1 being definitely pornographic. (For a full description check out Goh's GitHub page.) Here's what a low score looks like:

And here's a high score:

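Goh's actual implementation lives on his GitHub page; the underlying trick — nudging an image's pixels to push a classifier's score toward 1 — can be sketched with a toy example. To be clear, nothing below is Goh's code or Yahoo's classifier: the logistic "scorer" and the gradient-ascent loop are illustrative stand-ins for the real neural nets.

```python
import math
import random

def nsfw_score(image, weights):
    """Toy stand-in for a pornography classifier: squashes a
    weighted sum of 'pixel' values into a score between 0 and 1."""
    z = sum(w * x for w, x in zip(weights, image))
    return 1.0 / (1.0 + math.exp(-z))

def ramp_up(image, weights, steps=200, lr=0.1):
    """Gradient ascent on the score. For this logistic scorer the
    gradient with respect to pixel i is s * (1 - s) * weights[i]."""
    img = list(image)
    for _ in range(steps):
        s = nsfw_score(img, weights)
        for i, w in enumerate(weights):
            img[i] += lr * s * (1.0 - s) * w
    return img

random.seed(0)
weights = [random.uniform(-1, 1) for _ in range(8)]
blank = [0.0] * 8                    # a "blank" image scores exactly 0.5
tuned = ramp_up(blank, weights)
print(nsfw_score(blank, weights))    # 0.5
print(nsfw_score(tuned, weights) > 0.9)  # True: the image now "looks" NSFW
```

The real systems do the same thing at scale: instead of eight numbers, the "image" is millions of pixels, and the scorer is a deep convolutional network.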
The most interesting application of Goh's work comes when he uses his program to generate NSFW imagery based on scenes already identified by MIT's neural net. Goh asks his program to create pictures of known imagery — like beaches, concerts, canyons, etc. — but then ramps up the NSFW factor so the network exaggerates the patterns, shapes, and colors that could be mistaken for male or female genitalia.

This means that in an art gallery or at a concert, we get lots of pictures of dicks:

While desert scenery seems to skew more towards female genitalia:

And when we take a trip to the beach, we get a mixture of both!
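That "ramp up the NSFW factor" step amounts to maximizing a blend of two objectives — a scene score and an NSFW score — with a knob controlling the mix. Again, this is a toy sketch, not the real networks: the two logistic scorers below use deliberately disjoint hypothetical "pixels" so the effect of the knob is easy to see.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def score(image, weights):
    """Toy linear-logistic scorer standing in for a real classifier."""
    return sigmoid(sum(w * x for w, x in zip(weights, image)))

# Hypothetical feature weights: the "scene" detector watches the first
# three pixels, the "NSFW" detector the last three. (Real networks'
# features overlap, which is exactly why dirty shapes leak into scenes.)
W_SCENE = [0.9, 0.8, 0.7, 0.0, 0.0, 0.0]
W_NSFW  = [0.0, 0.0, 0.0, 0.6, 0.8, 0.9]

def generate(lam, steps=300, lr=0.1):
    """Gradient ascent on score_scene + lam * score_nsfw,
    starting from a blank six-pixel 'image'."""
    img = [0.0] * 6
    for _ in range(steps):
        s1, s2 = score(img, W_SCENE), score(img, W_NSFW)
        for i in range(6):
            grad = s1 * (1 - s1) * W_SCENE[i] + lam * s2 * (1 - s2) * W_NSFW[i]
            img[i] += lr * grad
    return img

tame = generate(lam=0.0)   # pure scene, NSFW knob off
lewd = generate(lam=2.0)   # same scene, NSFW knob cranked up
print(score(tame, W_NSFW))         # 0.5: nothing pushed the NSFW pixels
print(score(lewd, W_NSFW) > 0.9)   # True
print(score(lewd, W_SCENE) > 0.9)  # True: it still scores as a "scene"
```

The last line is the key to the "hidden in plain sight" effect Goh describes: the blended objective still rewards looking like a beach or a canyon, so the NSFW shapes get smuggled into an otherwise recognizable scene.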

As Goh writes: "The images generated range from the garishly explicit to the subtle. But the subtle images are the most fascinating as to my surprise they are only seemingly innocent [...] The NSFW elements are all present, just hidden in plain sight." Indeed, a lot of these images look like the work of a surrealist painter trying to hide as many penis- and labia-like shapes in their landscapes as possible.

The project also exposes the frailties of many deep learning systems, though. After all, these computers don't know what constitutes NSFW imagery in the same way a human does — they've just been taught to look for certain patterns by absorbing vast amounts of pornographic (and non-pornographic) imagery. Running these processes backwards shows just how skin-deep that knowledge is.