The 'Singularity' of the nerds / Fringe group of computer programmers push toward a superhuman artificial intelligence

2004-01-11 04:00:00 PST New Haven, Conn. -- Eliezer Yudkowsky gives me the Singularitarian handshake. "You take a person's hand and let go a billion years later," he says. It's Saturday night in the Yale University dorms, where dozens of transhumanists have gathered for a conference called Transvision. Even among people who look forward to life as technologically improved beings, the Singularitarians are a fringe group -- about 50 young computer-programmer types rushing toward their chosen milestone of a post-human age.

"The Singularity is the technological creation of smarter-than-human intelligence," says Yudkowsky. "All the progress we've seen over the past couple of centuries was gained using the same old-fashioned hunter-gatherer brain. Now I think that the technological creation of smarter-than-human intelligence looks possible, that we can do it if we work hard enough, that we will get ourselves killed doing it if we aren't careful, and that if it is not done by worried rationalists, it will eventually be done by someone else."

Some scientists project that the "Singularity" -- a kind of artificial intelligence -- will happen by 2100; some, within the next 25 years. The Singularitarians believe it can happen -- it must happen -- within a decade. Yudkowsky says there is "a 2 percent chance" that artificial intelligence can save the world from eventual social and ecological collapse, which is why his group's been called a cult and "the rapture of the nerds."

"The end of the world is a highly technical issue," he says. "We're working to save everybody, heal the planet, solve all the problems of the world." Artificial intelligence can be used to transform the human mind, he says, and free us from pain and stress or "a sterile round of endless physical pleasures." He anticipates endless growth for every human being, "becoming everything we've ever dreamed of being, not for a billion years, but forever." If any utopia is possible for the human species, he says, it lies in the Singularity.

Yudkowsky is 23 years old. He says he had a "pseudotraumatic childhood" and no formal schooling, but scored 1410 (high) and 1600 (the highest possible score) on the SAT at ages 11 and 15, respectively. Like most transhumanists, he is Caucasian. He's tall, with stooped shoulders, glasses, some brown teeth. He considers himself shy and socially awkward, is a "volunteer virgin" who doesn't drink, smoke or do drugs. He abhors pop culture. "Eating Pringles and watching football," he says, is "dystopian."

Raised in West Rogers Park, a neighborhood in Chicago that is popular with Orthodox Jews, he now lives in Atlanta. ("I don't care where I live, so long as there's a roof to keep the rain off my books, and high-speed Internet access.") In 2000, he set up the Singularity Institute for Artificial Intelligence with the help of money from donors involved in a dot-com startup. SIAI is a small group, for the moment more of a think-tank club than a crucible of the future.

"Strange as the Singularity may seem, there are times when it seems much more reasonable, far less arbitrary, than life as a human," says Yudkowsky. "We'll do it or die trying. I won't make any move unless I think I'm really, actually going to succeed, even after all human stupidity is taken into account. I intend to spend my life making it real."

And what will Yudkowsky's AI look like? "She is information," he says. "She'd look like whatever she wanted to look like on a computer screen. Probably a screensaver at first, followed by slowly more sophisticated personas." He believes a "friendly, self-improving, seed artificial intelligence" will be the "easiest, safest path to the Singularity," but admits that, since the Singularity will transcend human intelligence, his AI could easily maneuver around any safeguarding attempts. "Opposing a superintelligence is likely to prove futile."

For humans it will mark the end of Darwinian evolution: an end, as Yudkowsky puts it, to the need for billions of sentient beings to die in order to achieve the tiniest incremental design improvements. Yudkowsky looks forward to upgrading his brain by "adding neurons at the rate it currently loses them."

At age 200, he says, he'll add on new capacity to avoid becoming bored or suffering a serious cognitive malfunction. At age 2,000, he would "probably need serious architectural changes to the mind." He plans to be alive after the last star in the Milky Way is dead.