Many of the stories you can tell about Alan Turing—the mathematician who did so much for the Allies during the Second World War, and was so betrayed by his country afterward—are like folk tales crossed with science-fiction novels. (He broke German naval codes in an office, in Bletchley Park, called Hut 8, which sounds like a place a robot witch would live.) And some are just parables about gratitude. Last week, Google sponsored a garden party near London (the city before the riots), in Bletchley Park, to raise money to restore some of the park’s proto-computing equipment. That is about as apt an act of corporate philanthropy as one can get; what is Google’s business but breaking the codes in Internet traffic, seeing the patterns, and guessing where we want to go next? It’s like connecting a torpedo to a U-boat in the middle of an ocean, which, very simplified, is what Turing did. The company also helped what is now the Bletchley Park Trust get some of his papers back. (“I think a lot of our staff feel that if they had been around during the war they would have wanted to work at Bletchley Park,” a Google spokesman told the Telegraph.)

At Bletchley Park, Turing helped figure out how to build a machine that could make sense of codes generated by Enigma, the Germans’ intractable encryption device. He also came up with theories about how computers might work. And he devised the Turing Test: if a person, after watching a number of conversations (via text on a screen) between a computer and a person, couldn’t tell which was which, then the computer could be said to be thinking. What could betray it as a machine—a lack of kindness? It is humane to think so. But one might work Turing’s experience into the equation.

In 1952, Turing was arrested for homosexual activities. (Gays and lesbians have always served their countries in wars, after all.) Given the choice between jail and a severe course of hormone treatment (known as chemical castration), he took the latter, though not for long; he killed himself two years later. He was only forty-two: he had been just twenty-seven when the war began, in 1939. About a year before that, Disney’s “Snow White” had opened in theatres, and Turing had talked about how taken he was by it. That’s not so unusual, and maybe no one would have remembered the detail. But Turing was found with an apple in his hand, one that he had filled with cyanide himself and then bitten. (By coincidence, there are three new Snow White movies in the works; one doubts any will play the poisoned-apple scene as uncannily.)

Back to Google. If you type in “Alan Turing” and “Fairy Tale,” Google will lead you straight to that Snow White anecdote. But is there an algorithm that could tell the moral of that story? Here’s a thought experiment: what if, in 1952, but with better computers than they had then, someone running a Turing Test had asked his interlocutors if Turing was being treated the way a war hero (or, for that matter, any person) should be? A computer might be programmed to dodge, maybe by saying that it preferred to just let the law do its work. There are people who talk like that. (Though citing “gross indecency,” the name for Turing’s supposed crime, would involve some complicated rounds of decoding.) Or the computer might cue up some other parable about apples. More interestingly, how would the person do? And how would the transcript have read, say, in 2009, the year Gordon Brown, then Prime Minister, apologized for what happened to Turing on behalf of the country, or twenty years from now? Turing had friends and admirers, and even those who didn’t know him, even back in 1952, might have been clearer than a computer about what actual decency was. But other people might have produced a stream of words that would look, to a future evaluator, like the absurdisms a computer cranks out when it has been led, by a questioner, to a point where its inhumanity is exposed. There are many ways to fail the Turing Test.

Turing in 1951. Photograph: National Portrait Gallery, Wikimedia Commons.