Schafer points out that we already have a rough legal framework that could work for smart robots: the laws that apply to dogs. “Normally the answer is if you put something dangerous in the environment you are still responsible for it,” he says. People who own dogs are well aware that they cannot possibly control every action their dog takes. But at the same time, if someone owns a dog that is dangerous, it is their responsibility to protect others from that dog. “As long as it was foreseeable for you that something you owned was going to cause harm, even if you couldn’t foresee the specific injury and harm, you’re responsible for it,” he says. And dogs are far more capable, creative, and intelligent than any computer system invented so far.

Even if the legal conversation is the same, our gut reaction to a crime involving a sophisticated robot shopper may be very different from our reaction to an accident involving, say, a drill or a ladder, especially when the robot has been made to act and sound like a person. Weisskopf says that even though the Random Darknet Shopper was simply an algorithm run by a computer, visitors still wanted to treat it as a living entity. “It’s not an intelligent piece of software, absolutely not, it can’t learn, all these things that some software actually could do, it can’t. But it behaves like a human, and visitors would look at the collection of these 12 items and try to think of the personality of the shopper,” she says. “Which I think is very typically human to do that.”