Take a web crawler, set it loose on the Internet, and tell it to scrape 10 gigs of images and video at random. What kind of images do you get back? As German coder Samim Winiger found, "a good chunk" of it will be porn and graphic violence. Not that he was necessarily surprised. "Look up how much of the Internet is adult content," says Winiger in an IM interview. "It's staggering."

Winiger wasn't looking for cheap thrills. He was collecting data for "Sensual Machines," a multi-part experiment the Berlin-based researcher is running to investigate what happens when we teach machines to see.


Winiger first took his collection of images and ran them through RCNN, a fairly sophisticated system that can scan a photo and identify the individual objects in it. But there was an interesting quirk. "The model I'm using in the experiment is state-of-the-art," says Winiger. It can recognize a thousand classes of objects, everything from a bouquet of flowers to a man on a bicycle. But it had never seen porn. "If you feed it porn, it goes, 'Hey, this is a bikini,' or 'Hey, this is a Scottish deerhound.'"

An example of RCNN in action.
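Detectors in the RCNN family propose many overlapping candidate boxes, score each one, and then keep only the best non-overlapping ones, a filtering step called non-maximum suppression. A toy sketch of that step (illustrative only, not Winiger's code; the boxes, scores, and labels here are made up):

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def nms(detections, threshold=0.5):
    """Keep the highest-scoring boxes, dropping any that overlap a kept box.

    detections: list of (score, box, label) tuples; returns the survivors.
    """
    kept = []
    for det in sorted(detections, key=lambda d: d[0], reverse=True):
        if all(iou(det[1], k[1]) < threshold for k in kept):
            kept.append(det)
    return kept

dets = [
    (0.9, (10, 10, 60, 60), "bikini"),
    (0.8, (12, 12, 58, 58), "bikini"),            # near-duplicate, suppressed
    (0.7, (100, 100, 150, 150), "Scottish deerhound"),
]
print([d[2] for d in nms(dets)])  # → ['bikini', 'Scottish deerhound']
```

The surviving boxes, with their labels, are what gets drawn over the photo.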

Using RCNN in combination with Project Oxford, which attempts to classify people by gender and age, he turned the system loose on all that filth. John Cleese in a bikini got analyzed:

Samim Winiger

As did a scene from an adult film:

Samim Winiger

Winiger also ran his collection of adult photos through a technique called semantic segmentation, which isolates the individual objects within a picture, creating what he called a "Computationally Generated Orgy."
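A segmentation model assigns a class label to every pixel, which is what lets you lift one object cleanly out of its background. A minimal sketch with a hand-made label map (the tiny grayscale "image" and labels below are invented for illustration):

```python
# Hypothetical 4x4 grayscale "image" and a per-pixel class map of the kind a
# segmentation model emits: 0 = background, 1 = person.
image  = [[ 10,  20,  30,  40],
          [ 50,  60,  70,  80],
          [ 90, 100, 110, 120],
          [130, 140, 150, 160]]
labels = [[0, 0, 1, 1],
          [0, 1, 1, 1],
          [0, 1, 1, 0],
          [0, 0, 0, 0]]

def cut_out(image, labels, target):
    """Isolate one object: keep pixels whose label matches, zero out the rest."""
    return [[px if lab == target else 0
             for px, lab in zip(img_row, lab_row)]
            for img_row, lab_row in zip(image, labels)]

person = cut_out(image, labels, target=1)
print(sum(px > 0 for row in person for px in row))  # 7 pixels survive
```

Compositing many such cutouts into a single frame is, in effect, how a machine-made collage like Winiger's gets assembled.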

Finally, Winiger used a program called the Deep-Visualization-Toolbox, which visualizes what a computer does when it attempts to make sense of an image. These pictures are, quite literally, what a computer sees when it looks at pornography:
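What tools like this display is, for each unit in the network, the activation map that unit produces over the input: bright where the filter "fires," dark where it doesn't. A toy analogue with a single hand-made edge filter (illustrative only, not the toolbox's actual code):

```python
# A 4x4 image with a hard vertical edge down the middle, and a 2x2 filter
# that responds to left-to-right changes in brightness.
image = [[0, 0, 9, 9],
         [0, 0, 9, 9],
         [0, 0, 9, 9],
         [0, 0, 9, 9]]
kernel = [[-1, 1],
          [-1, 1]]

def activation_map(image, kernel):
    """Slide the kernel over the image; each output cell is one unit's response."""
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(len(image) - kh + 1):
        row = []
        for j in range(len(image[0]) - kw + 1):
            row.append(sum(kernel[a][b] * image[i + a][j + b]
                           for a in range(kh) for b in range(kw)))
        out.append(row)
    return out

for row in activation_map(image, kernel):
    print(row)  # each row is [0, 18, 0]: the unit lights up only at the edge
```

A real network stacks thousands of learned filters like this one, and the toolbox renders their maps live as the image changes.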

For Winiger, this isn't really about treating an algorithm to some high-tech titillation. As he explains it, "Sensual Machines" forces humans to see their own actions and intentions in the programs they create. "There's a growing tendency at corporations doing machine learning stuff to hide behind the black-box mentality," he said. "They say, 'Hey, it's the machine that made the decision.' This is only true in a very limited sense. There is human intervention at every step, it's just mediated."

Another image Winiger analyzed reveals the darker ramifications of what he means:


"If a drone kills a bunch of folks automatically and wrongly based on an AI-based system, who is to blame?" asks Winiger. "The AI model? The system? The code? The operator?"

Or, he says, soon people will use the tools of machine learning to create child porn. "Who is legally liable here? The computer model? The person who wrote the model? The training data? The person who watched the porn? It's very much an open question."

As machine learning and computer vision systems rapidly advance, Winiger wants to force their creators to remember that humans are still the ones keying in the code. "What AI is really about is a further collaboration between machine and man, not the stupidly naive notion of the singularity." As machines grow ever smarter, Winiger wants us to remember that they're tools—tools that humans are wielding. "You can use a hammer to kill people," he says. "Or make houses."
