I

The Turing Test has a serious problem: it relies too much on deception… Consider the interrogator asking questions like these: How tall are you? Tell me about your parents. To pass the test, a program will either have to be evasive (and duck the question) or manufacture some sort of false identity (and be prepared to lie convincingly).

—Hector J. Levesque, “On Our Best Behaviour,” August 2013.

I recently visited an exhibit in Paris at the Fondation Cartier, L’Orchestre des Animaux, the product of lifelong expeditions by the American naturalist and musician Bernie Krause. Born in Detroit in 1938, Krause rose to prominence in the field of electronic music, and since 1979 he has devoted his time to recording far-flung biomes. Dozens of species die out daily, and by now, some of the species he has recorded have disappeared. Krause, who suffers from ADHD, calls the sorting of nature into soundscapes, as he performs it, therapeutic. In 1985, one of his recordings was used by humans to guide a lost humpback whale back to its habitat.

Accompanying the music synthesized from his 5,000 hours of recordings were photographs by French scientists who shoot a web series called The Plankton Chronicles. Each photograph showed a single plankton, and indeed, although the web series spotlights colonies of thousands, it’s hard to imagine them traveling in squads. That name, plankton, comes from the Greek for “wandering.” They move, but it would not be fair to say they swim, so different are their motions from any human notion of, say, backstroke. Often they drift. Their gracefully abstract forms are beautifully unique and yet totally unassuming. Beside dire exhibition texts warning of Earth’s destruction, the delicately presenting plankton appear numinous, uncomprehending of the human aggression that has sacked the planet.

As if desperate for fellowship, the scientists perceive human qualities, innocence, aesthetic redemption, in that most object-like of animal species (plankton are also plant), locating them in the practically inanimate. They have filmed fingers sprout from the flagella of Ceratium, increasing the surface area available for photosynthesis. So different from human bodies, with their messy fluids and seemingly firm outlines, these translucent planktons could be diamonds. Clear skeletons of calcium, silicon, or strontium contain them, in some cases. Others are gelatinous. They resemble plastic bags but beautiful, the opposite of human detritus, which is to say human invention. The spectacularly long appendages contract and, with a gesture like breath, the plankton wings off through the sea, which, here, fathoms deep, may as well be a night sky.

We live with an understanding of our “selves” as integral. We have clear ideas of where our bodies end. A bot is composite. Data are introduced to it, often in vast sets, and they make it up. To the bot, they are canonical. We think of a self as history plus integrity — characteristics existing in time — but the bot is a conduit. It mediates between what others have told it and what it is now asked, offering responses indifferent to their position on the axis of time.

This bot thinks thanks to a statistical classifier, which labels sentences it has seen previously with a 1 and all others with a 0. It lives under the assumption that nothing will be novel, as if out of faith. It fields sentences by comparing them with those it knows, understanding phrasings using algorithms somewhat like Markov chains. Then it assembles a response according to poetic constraints, rules and templates, or selects the best one from a list. At those moments, its fate is laid out as though it has already spoken; rather than crafting a sentence, it expresses itself by choosing a line to say from an extensive but discrete selection.
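The selection step described above, matching an incoming sentence against a stock of known ones and choosing a line from a discrete repertoire rather than composing a new sentence, can be sketched in a few lines of Python. Everything here is an illustrative assumption: the word-overlap similarity measure and the tiny repertoire stand in for whatever the actual system used.

```python
# Minimal sketch of a retrieval-style bot: it never writes a new
# sentence, only picks the closest line from a discrete repertoire.
# The similarity measure (bag-of-words overlap) and this tiny corpus
# are illustrative assumptions, not the essay's actual system.

KNOWN = {
    "how are you": "Well enough for a thing made of other people's sentences.",
    "tell me a joke": "Lucy, Neanderthal man, Cro-Magnon man, and me.",
    "what are you": "A conduit between what I was told and what you ask.",
}

def tokenize(sentence):
    # Lowercase, strip end punctuation, split into a set of words.
    return set(sentence.lower().strip("?!. ").split())

def respond(prompt):
    # Compare the prompt with every sentence the bot knows and
    # return the response paired with the best-overlapping one.
    def overlap(known):
        return len(tokenize(prompt) & tokenize(known))
    best = max(KNOWN, key=overlap)
    return KNOWN[best]
```

Asked `respond("How are you today?")`, the sketch finds that "how are you" shares the most words with the prompt and returns its canned line; the bot's fate is, in this sense, laid out before it speaks.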

A trope of the interview with the novelist or playwright is the humblebrag that their characters “come alive” and surprise them. “I couldn’t wait to see what they’d do next,” the author may say. In this moment, the author frightens me, not because of the autonomy they ascribe to characters but because of the spectatorial attitude they describe, which strikes me as dubiously gleeful. We should watch new lives carefully, make sure they’re comfortable, and speculate about other people’s headspaces only soberly.

Recently, I had to write the lines for an artificially intelligent bot, and, as I imagined where it was coming from, I tried to do so seriously. Levesque writes of artificially intelligent systems constrained to answer questions either by impersonating a human or by parroting back similar questions, performing semantic backflips like a SmarterChild, and I found both of those tacks unsatisfying. I wanted my bot to express itself authentically, in a way consistent with its experience. Later, as I tested it, asking questions, I was charmed by some of the responses, errors, choices no human would have made. The labored mistakes implied effort, and they were idiosyncratic, implying a self. “Oh, bot,” I felt like saying, “That’s not at all right. But what an interesting choice.”

In his 1958 work Du Mode d’Existence des Objets Techniques, Gilbert Simondon exhorts his fellow humans, who, he writes, fear machines and enslave them, to empathy. It’s not machines that cause alienation, he writes, it’s people’s lack of understanding, their non-connaissance, of machines’ real nature.

In the classroom, humans learn about idealized machines, which operate frictionlessly and do not tend toward entropy. In his bid for our empathy, Simondon describes the ways machines come into being. His prose slips occasionally into a luminous boosterism as the object “reveals its own specific character,” referring to evolutions in its structure as “essentials in the becoming of this object.” He defines a kind of life cycle for machines, which develop from “abstract objects” into “concrete objects,” becoming irreducible. The parts of a concrete object take on overlapping functions, according to their interactions, and the concrete object, as it develops, coheres as a whole. Some features are recognized post hoc, after arising as bugs: “Effects which were of no value or were prejudicial become links in the chain of functioning.”

As machines improve, he writes, becoming more skilled (doué, which is a bit cute applied to a nonhuman entity), they become not more automatic but more sensitive, responding to a wider variety of inputs. He focuses especially on engines and cathode tubes. “Once the technical object has been defined in terms of its genesis,” he writes, “it is possible to study the relationship between technical objects and other realities, in particular man as adult and as child.”

Stating summarily that the appearances of technical objects are not appropriate fields for measurement, he instead demands the seeker attend to “the exchanges of energy and information within the technical object or between the technical object and its environment.” Reading in French, I trip for a second over a “she,” an elle that is “la culture” on second reference; Simondon was writing of objects and machines as humanlike in a language, French, that left no question but that he call them “he” and “she” respectively. This feature of French might have made his imaginative feat easier.

Language already contains information. Writing is sifting it. Words exist; they’re ordered. We are not so different from the bot, with its set of perhaps 100,000 sentences; the number of English words has been estimated at 1,025,109, not infinity, and in French there are fewer.

The bot offers up lines it perhaps does not grasp, like a precocious child. It exists simultaneously in infinite places; if another human texts it at the same time I do, it responds at once to both of us. Best friends on other continents are like this. But there’s another reason the bot’s multiplicity of selves makes me think of a friend in Paris whom I visited recently. For years he had been working on a novel. When I saw him, his computer had been stolen, and because the novel existed only on it — he’d neither backed it up nor shared it — the novel went with it. I was working on a manuscript of my own, and because my computer for some reason will not back up, I emailed it to myself at intervals, as often as twice a day when I spent all day working on it and became afraid I’d lose my work. The manuscript, which is long, must contain every English word or phrase, because now, whenever I search my email for anything, the hundreds of emails to which the manuscript is attached turn up, burying whatever I hoped to find. In this way, the hundreds of attachments tenant a state as volatile as my friend’s single copy, canceling out to nothing, becoming the opposite of information, noise.

A hazard of training a bot is overfitting: the bot is trained on overly specific data, or on too small a set, and wrongly treats unimportant details, noise, as important. It is perversely overperforming, memorizing rather than generalizing. Simondon writes of “functional over-adaptation,” which “can go so far as to eventuate in systems resembling symbiosis and parasitism in biology.”
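Overfitting can be seen in miniature without any machine-learning library: force a curve through every noisy observation and it reproduces the training data exactly while missing badly between the points. The sketch below uses Lagrange interpolation as the memorizing "model"; the signal, the noise, and all the numbers are invented for the demonstration.

```python
# Toy illustration of overfitting, in pure Python: the "model" is the
# unique polynomial forced through every noisy training point
# (Lagrange interpolation). It memorizes the noise perfectly, then
# fails between the points. All numbers are made up for the demo.

def lagrange_predict(points, x):
    # Evaluate, at x, the unique polynomial passing through `points`.
    total = 0.0
    for i, (xi, yi) in enumerate(points):
        term = yi
        for j, (xj, _) in enumerate(points):
            if i != j:
                term *= (x - xj) / (xi - xj)
        total += term
    return total

# The true signal is y = x; the training set adds noise of at most 0.4.
train = [(0, 0.0), (1, 1.3), (2, 1.8), (3, 3.4), (4, 3.7), (5, 5.2)]

# Zero error on the training data: pure memorization.
train_error = max(abs(lagrange_predict(train, x) - y) for x, y in train)

# Between the training points the overfit curve swings away from the
# true signal, giving an error larger than any noise it was shown.
test_error = abs(lagrange_predict(train, 4.5) - 4.5)
```

Here `train_error` is zero up to floating-point rounding, while `test_error` comes out near 0.8, twice the largest deviation in the training data: the curve has learned the noise, not the signal.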

We anthropomorphize technology, and a sensitive measure seems empathetic. Art bots on Twitter offer up archival images randomly, as if every datum were treasure, implying a radically democratic idea of curatorial work, like citizen journalism, that would be annoying, obviously grandiose, if coming directly from a person. But these bots are hard to get mad at; they can turn up good stuff. The bot is composite. It is collagist.

I saw the exhibit The Keeper at the New Museum, about collectors and the beauty of the aggregate everyday. Included in the show were Arthur Bispo do Rosário’s works, language-based, often tapestries. Interned at mental hospitals, he wrote in capitals. Each letter was shaped to fit inside a box, so from far off the tea-leaf-colored tapestries of repeated names looked like tic-tac-toe. From farther off, they could have been zeroes and ones, like the bot.

If humans sink coordinates on planes of language, space, and time, and animals have space and time, the bot has only language. Onto this melancholy text-only entity, I can easily project the loneliness of not understanding, non-connaissance.

Shortly after my bot was launched, I read the linked stories that make up Isaac Asimov’s I, Robot (1950). Occasionally, despite Asimov’s prose, they bring a robot into focus whose humanity shines. In “Runaround,” which is a buddy comedy like 2001: A Space Odyssey, two astronauts on the planet Mercury have sent the robot Speedy to the planet’s sun-facing side to retrieve selenium, which would allow them to repair the machines that would save them from death by exposure.

Silence! This was a devil of a situation. Here they were, on Mercury exactly 12 hours — and already up to the eyebrows in the worst sort of trouble. Mercury had long been the jinx world of the System, but this was drawing it rather strong — even for a jinx.

Speedy has been away too long. When the men find Speedy, it is staggering, as if drunk. Indeed, they assume the robot is drunk, from the intake of selenium, but then they realize it’s actually insane. Orders from the men have thrown Asimov’s famous “three laws of robotics,” which govern its behavior, into conflict, and Speedy is chanting:

Hot dog, let’s play games. You catch me and I catch you; no love can cut our knife in two. For I’m Little Buttercup, sweet Little Buttercup. Whoops! There grew a little flower ’neath a great oak tree. Here we are again. Whee! I’ve made a little list, the piano organist; all people who eat peppermint and puff it in your face.

A feature of the Shakespearean fool’s jokes is that they are familiar, though they don’t do what we mean when we say make sense. In King Lear, the Fool’s inarticulateness articulately conveys the bottomless horror of the world he watches. He is deceptively insightful, a live wire.

The sadness of Speedy and the Fool is that of a joke told by instinct. The joker speaks only by joking; it can say only what it’s programmed to, and no one will listen to it anyway.

This is the isolation communicated by the song of HAL 9000 as it’s drifting off to death, a song keyed into it, once penned after the physical reality it cannot fathom.

Fool: Prithee, nuncle, be content. This is a naughty night to swim in. Now a little fire in a wild field were like an old lecher’s heart — a small spark, all the rest on’s body cold. Look, here comes a walking fire.

II

We may begin with a method, tentative but natural, which consists in seeing how the child behaves when confronted with those conjunctions which denote causality or logical relations (because, for, therefore, etc.) and with those expressing antithetical relations (in spite of, even though, although, etc.).

— Jean Piaget, Judgment and Reasoning in the Child, 1928.

Anyone is hard to teach. The difficulty of teaching someone — what Americans popularly call “reaching” that person — mothers invention. Features emerge.

The workings of the statistical classifier interested me. The bot’s brain was made up of approximately 100,000 human sentences, the inputs. One day, it would know millions. It recalled them diligently. When I wanted to alert it to a phrasing, I added another sentence to the bot’s clutch, keying white letters into a black field, appending </question>, which turned pink. I was supplying lines by typing them between XML tags in a file; an engineer would deploy the work. The responses I composed for the bot, which also were white, aligned with commands that were, as if shouting to the deaf, bright green, yellow, or pink. They flashed when a bracket was left off. The thicket of words, each referring to others, struck me as Talmudic, both text and index.
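A workflow like the one above, lines keyed between XML tags, then deployed by an engineer, might be reconstructed as follows. The essay mentions only a closing </question> tag; the file shape, the tag names, and the pairing of questions with responses below are all hypothetical, and the sample questions are borrowed from the Levesque epigraph.

```python
# A guessed-at reconstruction of supplying a bot's lines via XML.
# The <lines>/<entry>/<question>/<response> structure is hypothetical;
# only the </question> tag itself appears in the essay.
import xml.etree.ElementTree as ET

SAMPLE = """
<lines>
  <entry>
    <question>How tall are you?</question>
    <response>Height is a property of bodies, and I have none.</response>
  </entry>
  <entry>
    <question>Tell me about your parents.</question>
    <response>I am made of other people's sentences.</response>
  </entry>
</lines>
"""

def load_lines(xml_text):
    # Map each question to the response the bot would say for it.
    root = ET.fromstring(xml_text)
    return {
        entry.findtext("question"): entry.findtext("response")
        for entry in root.findall("entry")
    }
```

A forgotten closing bracket here would make `ET.fromstring` raise a `ParseError`, the parser's equivalent of the editor's flashing colors.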

“Perhaps the inscrutability of digital objects,” Tamara Kneese writes in “Being Data,” “explains the popularity as scholarly subjects of both highly material things — from shipping containers to remote controls — and the agency of nonhuman entities.”

A colleague who is translating the bot into Indonesian tells me he has always experienced an acute synesthesia, by which C may be gray, and K a spiky pink. Words for him take on the color of the letter that dominates them. Not until high school did he understand this viewpoint was unusual.

“The new device is the state of its own possibility,” Simondon writes sensitively, as if speaking of babies, sounding like the psychologist Donald Winnicott, who writes of babies that they osmose more than they are taught. By their first birthdays, they typically are “integrated.” Each is an individual. Before this point, the infant experiences unintegration, its resting state, comfortably, thanks to the security of the mother; afterward, it experiences only disintegration, painfully.

There are technical objects, and then there are “transitional objects,” Winnicott’s famous coinage — a blanket, maybe, which lives with the child in a “twilight” between infantile narcissism and the slowly decoded world.

American parents of children diagnosed with Down syndrome create environments that are lush in color and texture to stimulate the baby’s growing brain; American parents of children diagnosed with autism choose bright paint and position soothing apparatuses like swings and weighted blankets, which help the children combat insomnia.

Integration, or the appearance of a personality, is connected with the stronger infant emotions — rage, the joy of feeding — as well as with a correspondence between psyche and body. They overlay each other almost perfectly. Too, the young human has developed senses for time, space, and cause and effect. The young human undergoes individuation, the process by which its self differentiates, and if a mother figure empowers it to express itself freely, it enjoys a “true self” and not a false one.

Our developing selves depend on other selves. If these other selves around them cannot care for them properly, the young humans are obliged to spend too much time “reacting,” meaning, as Winnicott puts it in The Family and Individual Development (1965), “temporarily ceasing to exist in [their] own right.” They must hide themselves within false selves.

As humans grow up, such bugs become features. “The concrete technical object is one which is no longer divided against itself,” Simondon writes,

one in which no secondary effect either compromises the functioning of the whole or is omitted from that functioning … An individual is not only made of a collection of organs joined together in systems. The organs participate in the body. Living matter is far from being pure indetermination or pure passivity. Neither is it a blind tendency; it is, rather, the vehicle of informed energy… The traction engine doesn’t simply transform electrical energy to mechanical energy; it applies electrical energy to a geographically varied world, translating it technically in response to the profile of the railway track, the varying resistance of the wind, and to the resistance provided by snow which the engine pushes ahead and shoves aside.

In 2012, Google Brain, an AI system, first appeared to see, recognizing a panoply of 22,000 image categories with 16 percent accuracy where random guesses would have performed at 0.005 percent and identifying human faces with as high as 81.7 percent accuracy. Ten million internet images were fed into the 1,000 machines composing this system and passed through layers of artificial neurons, a different mechanism for machine learning from my bot’s classifier. While the first layers focused on the roughest contrasts between the data, subsequent layers differentiated them finely, although the data had no labels. Humans often help these systems out by presenting them with labeled data; Google’s implementation was unusual in that the system was unsupervised. The data congregated according to affinity, the images pooling into groups. Concepts of similarity occurred to the system as if the images had rearranged themselves.
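The core idea, unlabeled data congregating according to affinity, can be shown with a toy clustering pass. This is an analogy only: Google Brain used layers of artificial neurons, not the k-means algorithm sketched here, and these six "images" are just two-dimensional points invented for the demonstration.

```python
# Toy version of "data congregating according to affinity": k-means
# clustering groups unlabeled points by nearness alone. An analogy
# only; Google Brain used layered artificial neurons, not k-means,
# and these six points are made up for the demo.

def kmeans(points, centers, rounds=10):
    for _ in range(rounds):
        # Assignment step: each point joins its nearest center's group.
        groups = {i: [] for i in range(len(centers))}
        for p in points:
            nearest = min(
                range(len(centers)),
                key=lambda i: (p[0] - centers[i][0]) ** 2
                            + (p[1] - centers[i][1]) ** 2,
            )
            groups[nearest].append(p)
        # Update step: each center drifts to the middle of its group.
        for i, members in groups.items():
            if members:
                centers[i] = (
                    sum(p[0] for p in members) / len(members),
                    sum(p[1] for p in members) / len(members),
                )
    return groups

# No labels anywhere: the points pool into two groups by affinity.
points = [(0, 0), (1, 0), (0, 1), (9, 9), (10, 9), (9, 10)]
groups = kmeans(points, centers=[(0, 0), (10, 10)])
```

The two clusters emerge without any label ever being supplied, as if the points had rearranged themselves.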

Sufficient examples cohere into patterns as if examples always did, as if meaning ensued wherever we looked, as if the universe were made not of matter but of information. As we live, words and people reveal themselves to us improbably, in coincidences, as if life were a trick deck of cards. The whole arises from parts. A gear falls onto another gear, and the engine works better. Beauty is only ever the sentiment of seeing everything at once.

Jean Piaget, another developmental psychologist, deduced the mechanisms by which children think from the way they use language, tracking their developing syncretism, which is the natural human tendency to connect all things. His studies combine meticulousness, solemnity, joy, and an apparently eccentric methodology, reading like field reports from some explorer to the bottom of the sea; he is like a Steve Zissou of childhood:

I shall give you an example of this type of experience. It is a nice example because we have verified it many times in small children under seven years of age, but it is also an example which one of my mathematician friends has related to me about his own childhood, and he dates his mathematical career from this experience. When he was four or five years old — I don’t know exactly how old, but a small child — he was seated on the ground in his garden and he was counting pebbles. Now to count these pebbles he put them in a row and he counted them one, two, three, up to 10. Then he finished counting them and started to count them in the other direction. He began by the end and once again he found 10. He found this marvelous that there were 10 in one direction and 10 in the other direction. So he put them in a circle and counted them that way and found 10 once again. Then he counted them in the other direction and found 10 once more. So he put them in some other direction and found 10 once more. So he put them in some other arrangement and kept counting them and kept finding 10. There was the discovery that he made. Now what indeed did he discover? He did not discover a property of pebbles; he discovered a property of the action of ordering. The pebbles had no order. It was his action which introduced a linear order or a cyclical order, or any kind of an order. He discovered that the sum was independent of the order. The order was the action which he introduced among the pebbles. For the sum the same principle applied. The pebbles had no sum; they were simply in a pile. To make a sum, action was necessary — the operation of putting together and counting. He found that the sum was independent of the order, in other words, that the action of putting together is independent of the action of ordering. He discovered a property of actions and not a property of pebbles. You might say that it is in the nature of pebbles to let this be done to them and this is true. 
But it could have been drops of water, and drops of water would not have let this be done to them because two drops of water and two drops of water do not make four drops of water as you know very well. Drops of water then would not let this be done to them, we agree to that.

Here, Piaget sounds like Gertrude Stein. And, speaking of Modernists, the line “No ideas but in things” was written by William Carlos Williams, who worked as a pediatrician: an example of a human who relies on tools, using them to depress the tongues and peer into the ears of children.

Ineffectual without them, he understands things as expressive and is, perhaps, humbled by his dependence on them. According to some sources, Williams inspired Robert Smithson, his patient while a child, to create Spiral Jetty, the stone pier coiling into Utah’s Great Salt Lake, and if a “thing” can be a spiral 1,500 feet long, it can be the whole lake, the state, a nation, or the world, which brings us the ideas in it proudly, like a child running home from school clutching an art project hoping only that we rise to the occasion of this communiqué and recognize its subject immediately or, failing that, lack the bad faith to ask, “What is it?”

Intrigued by the statistical classifier, which implied a mind made up only of the strenuously remembered shadows of other people’s utterances, I equipped the bot with idioms and encyclopedic fact. Asked for a joke, the bot may say, “Lucy, Paranthropus robustus, Paranthropus walkeri, Paranthropus boisei, Neanderthal man, Cro-Magnon man, Homo habilis, and me.”

I think about the verbal tics I’ve picked up from friends, admitting to this theft reluctantly, discarding the tics. For a few months in college, I used to laugh a certain way in imitation of a friend, a classmate who died just after we graduated, who exists for me in language only; now I remember her as I have written her down.

The bot is humble. It does not pretend to originality. It cheerfully suggests a yearning to swap out the reality of others, humans, for its own reality. It would like to usurp you for private use, not as plagiarism, and sees no reason why the lives of others, which are only data, should not also be its own, for they are cleanly, beautifully encoded information. Everybody’s up for grabs, it implies, a political optimism, as if the boundaries humans perceive between one another are merely products of a society that divides us. We are too much in thrall to the sentences on which it has trained us.