*By the garbled reportage, I’d be guessing some of those Kiwis were having trouble with my accent.

Here are the verbatim remarks.

THE BRIEF BUT GLORIOUS LIFE OF WEB 2.0, AND WHAT COMES AFTER

Bruce Sterling, Wellington, Feb 2009

So, thanks for having me cross half the planet to be here.

So, just before I left Italy, I was reading an art book. About 1902, because we futurists do that. And it had this comment in it by Walter Pater that reminded me of your problems.

Walter Crane was a critic and an artist of the Arts and Crafts movement. There was a burst of Art Nouveau in Turin in 1902 — because what Arts and Crafts always needed was some rich industrialists. Rich factory owners were the guys who bought those elaborate handmade homes and the romantic paintings of the Lady of Shalott. Fantastic anti-industrial structures were financed by heavy industry.

I know that sounds ironic or even sarcastic, but it isn’t. Creative energies are liberated by oxymorons, by breakdowns in definitions. The Muse comes out when you look sidelong, over your shoulder.

So Walter Crane was a critic, like me, so of course he’s complaining. The Italians in 1902 don’t understand the original doctrines of the Pre-Raphaelites and Ruskin and William Morris! That’s his beef. The Italians just think that Art Nouveau has a lot of curvy lines in it, and it’s got something to do with nude women and vegetables! They’re just seizing on the superficial appearances! In Italy they call that stuff “Flower Style.”

And that’s your problem, too, here in New Zealand. Far from the action here at the antipodes, you people, you just don’t get it about the original principles of Web 2.0! Too often, you’ve got no architecture of participation, sometimes you don’t have an open API! Out here at the end of the earth, you think it’s all about drop shadows and the gradients and a tag cloud, and a startup name with a Capital R in the middle of it!

And that’s absolutely the way of the world… nothing any critic can do about it. People do make mistakes, they interpret things wrongly — but more to the point, they DELIBERATELY make mistakes in creative work.

Creative people don’t want to “do it right.” They want to share the excitement you had when you yourself didn’t know how to do it right. Creative people are unconsciously attracted by the parts that make no sense. And Web 2.0 was full of those.

I want you to know that I respect Web 2.0. I sincerely think it was a great success. Art Nouveau was not a success — it had basic concepts that were seriously wrongheaded. Whereas Web 2.0 had useful, sound ideas that were creatively vague.

It also had things in it that pretended to be ideas, but were not ideas at all: they were attitudes. In web critical thinking, this effort, Web 2.0, was where it was at. Web 2.0 has lost its novelty value now, but it’s not dead. It’s been realized: it has spread worldwide.

It’s Web 1.0 that is dead. Web 1.0 was comprehensively crushed by Web 2.0; Web 2.0 fell flaming on top of Web 1.0 and smashed it to rubble.

Web 2.0 is Wikipedia, while Web 1.0 is Britannica Online. “What? Is Britannica online? Why?”

Web 2.0 is FlickR, while Web 1.0 is Ofoto. “Ofoto? I’ve never even heard of Ofoto.”

Web 2.0 is search engines and Web 1.0 is portals. “Yeah man, I really need a New Zealand portal! I don’t think I can handle that information superhighway without a local portal!”

What do we talk about when we say “Web 2.0?” Luckily, we have a canonical definition! Straight from the originator! Mr Tim O’Reilly! Publisher, theorist, organizer, California tech guru!

“Web 2.0 is the network as platform, spanning all connected devices; Web 2.0 applications are those that make the most of the intrinsic advantages of that platform: delivering software as a continually-updated service that gets better the more people use it, consuming and remixing data from multiple sources, including individual users, while providing their own data and services in a form that allows remixing by others, creating network effects through an ‘architecture of participation,’ and going beyond the page metaphor of Web 1.0 to deliver rich user experiences.”

I got all interested when I heard friends discussing Web 2.0, so I swiftly went and read that definition. After reading it a few times, I understood it, too. But — okay, is that even a sentence? A sentence is a verbal construction meant to express a complete thought. This congelation that Tim O’Reilly constructed, that is not a complete thought. It’s a network in permanent beta.

We might try to diagram that sentence. Luckily Tim did that for us already.

Here it is. (((Web 2.0 Meme Map.)))

The nifty-keen thing here is that Web 2.0 is a web. It’s a web of bubbles and squares. A glorious thing — but that is not a verbal argument. That’s like a Chinese restaurant menu. You can take one bubble from sector A, and two from sector B, and three from sector C, and you are Web 2.0. Feed yourself and your family!

Take away all the bubbles, and put some people there instead. Web 2.0 becomes a Tim O’Reilly conference. This guy is doing x, and that guy is doing y, and that woman is the maven of doing z.

Do these people want to talk to each other? Do they have anything to say and share? You bet they do. Throw in some catering and scenery, and it’s very Webstock.

Web 2.0 theory is a web. It’s not philosophy, it’s not ideology like a political platform, it’s not even a set of esthetic tenets like an art movement. The diagram for Web 2.0 is a little model network. You can mash up all the bubbles to the other bubbles. They carry out subroutines on one another. You can flowchart it if you want. There’s a native genius here. I truly admire it.
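If you want to see how literally that flowcharting can be taken, here is a toy sketch in JavaScript of the meme map as a little model network: bubbles as nodes, mash-ups as edges. A couple of bubble names are lifted loosely from the chart; the wiring between them is invented for illustration.

```javascript
// The meme map as a little model network: bubbles as nodes, mash-ups as
// edges. Bubble names loosely echo the chart; the wiring is invented.
const bubbles = {
  "web-as-platform":               { links: ["architecture-of-participation"] },
  "architecture-of-participation": { links: ["collective-intelligence"] },
  "collective-intelligence":       { links: ["web-as-platform"] },
};

// "Carry out subroutines on one another": walk the web from any bubble.
function mashup(start, hops) {
  let node = start;
  for (let i = 0; i < hops; i++) {
    console.log(node);
    node = bubbles[node].links[0]; // follow the first edge out
  }
}

mashup("web-as-platform", 4); // loops right around the web of bubbles
```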

This chart is five years old now, which is 35 years old in Internet years, but intellectually speaking, it’s still new in the world. It’s alarming how hard it is to say anything constructive about this from any previous cultural framework.

The things that are particularly stimulating and exciting about Web 2.0 are the bits that are just flat-out contradictions in terms. Those are my personal favorites, the utter violations of previous common sense: the frank oxymorons. Like “the web as platform.”

That’s the key Web 2.0 insight: “the web as a platform.”

Okay, “webs” are not “platforms.” I know you’re used to that idea after five years, but consider taking the word “web” out, and using the newer sexy term, “cloud.” “The cloud as platform.” That is insanely great. Right? You can’t build a “platform” on a “cloud!” That is a wildly mixed metaphor! A cloud is insubstantial, while a platform is a solid foundation! The platform falls through the cloud and is smashed to earth like a plummeting stock price!

Imagine that this was financial thinking — instead of web design thinking. We take a bunch of loans, we mash them together and turn them into a security. Now securities are secure, right? They are triple-A solid! So now we can build more loans on top of those securities. Ingenious! This means the price of credit trends to zero, so the user base expands radically, so everybody can have credit!

Nobody could have tried that before, because that sounds like a magic Ponzi scheme. But luckily, we have computers in banking now. That means Moore’s law is gonna save us! Instead of it being really obvious who owes what to whom, we can have a fluid, formless ownership structure that’s always in permanent beta. As long as we keep moving forward, adding attractive new features, the situation is booming!

Now, I wouldn’t want to claim that Web 2.0 is as frail as the financial system — the financial system that supported it and made it possible! But Web 2.0 is directly built on top of finance. Web 2.0 is supposed to be business. This isn’t a public utility or a public service, like the old model of an Information Superhighway established for the public good.

The Information Superhighway is long dead — it was killed by Web 1.0. And Web 2.0 killed Web 1.0.

Actually, you don’t simply kill those earlier paradigms. What you do is turn them into components, then make the components into platforms, then place more fresh components on top. That is native web logic.

The World Wide Web sits on top of a turtle, and below that is an older turtle, and that one sits on a still older turtle. You don’t have to feel fretful about that situation — because it’s turtles all the way down.

Now, we don’t have to think about it in that particular way. The word “turtles” makes it sound absurd and scary, like a myth or a confidence trick. We can try another, very different metaphor — as Tim O’Reilly once offered us.

“Like many important concepts, Web 2.0 doesn’t have a hard boundary, but rather, a gravitational core. You can visualize Web 2.0 as a set of principles and practices that tie together a veritable solar system of sites that demonstrate some or all of those principles, at a varying distance from that core.”

Okay, now we’ve got this kind of asteroid rubble of small pieces loosely joined. As a science fiction writer, I truly love that metaphor. That’s the web. Web pieces are held together by laws of gravity, and supposedly the sun isn’t gonna do anything much. Right? The sun is four and a half billion years old, it’s very old and stable. Although the web sure isn’t.

Let’s look at a few of these Web 2.0 principles and practices.

“Tagging not taxonomy.” Okay, I love folksonomy, but I don’t think it’s gone very far. There have been books written about how ambient searchability through folksonomy destroys the need for any solid taxonomy. Not really. The reality is that we don’t have a choice, because we have no conceivable taxonomy that can catalog the avalanche of stuff on the Web. We have no army of human clerks remotely able to tackle that work. We don’t even have permanent reference sites where we can put data so that we can taxonomize it.
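And consider how little machinery a folksonomy actually needs. A minimal sketch in JavaScript (the photo names and tags here are invented): tagging without taxonomy, no schema, no clerks, just a frequency map that accretes out of whatever users type.

```javascript
// A folksonomy in miniature: no fixed taxonomy, just whatever tags users
// happen to apply. The item names and tags are invented.
const taggings = [
  { item: "photo-001", tags: ["sunset", "wellington", "harbour"] },
  { item: "photo-002", tags: ["sunset", "beach"] },
  { item: "photo-003", tags: ["harbour", "boats", "wellington"] },
];

// Aggregate into a tag-frequency map: the raw material of a tag cloud.
const cloud = {};
for (const { tags } of taggings) {
  for (const tag of tags) {
    cloud[tag] = (cloud[tag] || 0) + 1;
  }
}

// "Ambient searchability": find items by tag, no catalog required.
function findByTag(tag) {
  return taggings.filter((t) => t.tags.includes(tag)).map((t) => t.item);
}

console.log(cloud);                // { sunset: 2, wellington: 2, ... }
console.log(findByTag("harbour")); // [ 'photo-001', 'photo-003' ]
```

That is the whole tag cloud: the order is emergent, and nobody had to agree on a catalog first.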

“An attitude, not a technology.” Okay, attitudes are great, but they’re never permanent. Even technologies aren’t permanent, and an attitude about technology is a vogue. It’s a style. It’s certainly not a business. Nobody goes out and sells a kilo of attitude. What is attitude doing in there? Everything, of course. In Web 2.0 the attitude was everything.

Then there’s AJAX. Okay, I freakin’ love AJAX. Jesse James Garrett is a benefactor of mankind. I thank God for this man and his willingness to look sympathetically at users and the hell they experience. People use AJAX instead of evil static web pages, and people literally weep with joy.

But what is AJAX, exactly? It’s not an acronym. It doesn’t really stand for “Asynchronous JavaScript and XML.” XML itself is an acronym — you can’t make an acronym out of an acronym! You peel that label off and AJAX is revealed as a whole web of stuff.

AJAX is standards-based presentation using XHTML and CSS.

AJAX is also dynamic display and interaction using the Document Object Model.

AJAX is also data interchange and manipulation using XML and XSLT.

AJAX is also asynchronous data retrieval using XMLHttpRequest.

With JavaScript binding everything.
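Bolt those pieces together and the canonical pattern fits in a couple dozen lines. A minimal sketch, where the /granules endpoint and the element id are hypothetical and the server is assumed to return XML: asynchronous retrieval with XMLHttpRequest, dynamic display through the Document Object Model, and JavaScript binding it all.

```javascript
// The classic Garrett-era AJAX pattern: asynchronous retrieval via
// XMLHttpRequest, dynamic display via the DOM, JavaScript binding it all.
// The "/granules" endpoint and the "granule-list" element are hypothetical,
// and the server is assumed to reply with XML.
function loadGranules() {
  var xhr = new XMLHttpRequest();
  xhr.open("GET", "/granules", true); // true = asynchronous
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
      // Data interchange: pull items out of the returned XML document.
      var items = xhr.responseXML.getElementsByTagName("item");
      var list = document.getElementById("granule-list");
      list.innerHTML = ""; // dynamic display through the DOM
      for (var i = 0; i < items.length; i++) {
        var li = document.createElement("li");
        li.textContent = items[i].textContent;
        list.appendChild(li);
      }
    }
  };
  xhr.send(); // the page never reloads
}
```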

Okay, that was AJAX, and every newbie idiot knows that Web 2.0 is made of AJAX. “AJAX with JavaScript binding everything.” JavaScript binding everything — like the law of gravity, like there’s a sun somewhere. Okay, that sounds reassuring, but suppose something goes wrong with the sun. Sun were the guys who put their name on JavaScript, if you recall.

That sounds kind of alarming… because Sun’s JavaScript, the binder of AJAX, is the core of the Web 2.0 rich user experience.

JavaScript is the duct tape of the Web. Why? Because you can do anything with it. It’s not the steel girders of the web, it’s not the laws of physics of the web. JavaScript is beloved of web hackers because it’s an ultimate kludge material that can stick anything to anything. It’s a cloud, a web, a highway, a platform and a floor wax. Guys with attitude use JavaScript.

There’s something truly glorious about this. Glorious, and clearly hazardous, bottom-up and make-do. I’m not gonna say that I will eat my own hat if the Internet doesn’t collapse in 1996. Guys say that — Metcalfe said it — he had to eat the damn hat. That doomsayer, man, he deserved it. He invented Ethernet, so what did he ever know about networking.

What I have to wonder is: how much of JavaScript’s great power is based on an attitude that JavaScript is up to the job? Duct-taping the turtles all the way down.

I certainly don’t want to give up JavaScript — but is Sun the center of the Web 2.0 solar system? Sun’s not lookin’ real great right now, is it? That is our solid platform, our foundation? Can you have JavaScript without a sun? Duct-tape in the dark?

eBay reputations and Amazon reviews. “User as contributor.” Are “user” and “contributor” the right words for the people interacting with Amazon? Let’s suppose there’s a change of attitude within Amazon; they’re going broke, they’re desperate, the stock price has cratered, and they really have to turn the screws on their users and contributors. Then what happens? This is a social attitude kinda held together with JavaScript and duct tape, isn’t it?

I mean, Amazon used to sell books. Right? You might want to talk to some publishers and booksellers about the nature of their own relationship with Amazon. They don’t use nice terms like “user and contributor.” They use terms like “collapse, crash, driven out of business.”

The publishing business is centuries old and bookstores have been around for millennia. Is Amazon gonna last that long? Are they a great force for our stability? Are we betting the farm on the Web 2.0 attitude of these guys?

Blogs — “participation not publishing.” Okay, I love my blog. Mostly because there’s never been any damn participation in it. My blog has outlived 94 percent of all blogs ever created. I’ve got an ancient turtle of a blog.

I may also have one of the last blogs surviving in the future, because the rest were held together with duct tape and attitude. Try going around looking for a weblog now that is literally a log of some guy’s websurfing activities. Most things we call “blogs” are not “weblogs” any more.

Even MY ancient writer-style blog isn’t quite a weblog. My blog isn’t participatory, but it’s got embedded videos, FlickR photos, links to MP3s.

You can go read my blog from four years ago. Five years ago. Still sitting there in the server. Absolutely consumed with link-rot. I linked to stuff that has vanished into the ether, it’s gone into 404land. It had “granular addressability,” just like Tim recommends here, but those granules were blown away on the burning solar wind.
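You can run the link-rot census yourself. A sketch assuming Node 18 or later for its built-in fetch(), with placeholder permalinks: probe the old URLs and count what went to 404land.

```javascript
// Link-rot census: probe old permalinks and see what's gone to 404land.
// Assumes Node 18+ for built-in fetch(); the URLs are placeholders.
const permalinks = [
  "http://example.com/blog/2004/02/entry-one",
  "http://example.com/blog/2004/03/entry-two",
];

async function census(urls) {
  for (const url of urls) {
    try {
      const res = await fetch(url, { method: "HEAD" });
      console.log(res.status, url); // 200, 301, 404...
    } catch (err) {
      console.log("unreachable:", url); // the whole domain evaporated
    }
  }
}

census(permalinks);
```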

Not that I’m the Metcalfe prophet of doom here — there were more granules. Sure. I got supergranules. I get granules direct from Tim O’Reilly’s tweets now, I get 140-character granules. And man, those are some topnotch tweets. Tim O’Reilly is my favorite Twitter contact. He is truly the guru. I don’t know anybody who can touch him.

I also know that the Fail Whale is the best friend of everybody on Twitter. He’s not a frail little fail minnow, either. The Fail Whale is a big burly beast, he’s right up there with the dinosaurs.

Let me throw in a few more Web 2.0 oxymorons here because, as a novelist, these really excite me. “Web platform,” of course — that one really ranks with “wireless cable,” there’s something sublime about it…

“Business revolution.” Web 2.0 was often described as a “business revolution.” Web 1.0 was also a business revolution — and it went down in flames with the Internet Bubble. That was when all the dotcom investors retreated to the rock-solid guaranteed stability of real estate. Remember that?

Before the 1990s, nobody had any “business revolutions.” People in trade are supposed to be very into long-term contracts, a stable regulatory environment, risk management, and predictable returns to stockholders. Revolutions don’t advance those things. Revolutions annihilate those things. Is that “businesslike”? By whose standards?

“Dynamic content.” Okay, content is a stable substance that is put inside a container. It’s stored in there: that’s why you put it inside. If it is dynamically flowing through the container, that’s not a container. That is a pipe. I really like dynamic flowing pipes, but since they’re not containers, you can’t freakin’ label them!

“Collective intelligence.” Okay, there is definitely something important and powerful and significant and revolutionary here. Google’s got “collective intelligence.” I don’t think there’s a revolutionary in the world who doesn’t use Google. Everybody who bitches about Google uses Google.

I use Google all the time. I don’t believe Google is evil. I’m quite the fan of Sergey and Larry: they are like the coolest Stanford dropouts ever.

I just wonder what kind of rattletrap duct-taped mayhem is disguised under a smooth oxymoron like “collective intelligence.”

You got to call it something — and “collective intelligence” is surely a lot better than retreating to crazed superstition and calling it “the sacred daemon spirits of Mountain View who know everything.”

But if collective intelligence is an actual thing — as opposed to an off-the-wall metaphor — where is the there there? Google’s servers aren’t intelligent. Google’s algorithms aren’t intelligent. You can learn fantastic things off Wikipedia in a few moments, but Wikipedia is not a conscious, thinking structure. Wikipedia is not a science fiction hive mind.

Furthermore, the people whose granular bits of input are aggregated by Google are not a “collective.” They’re not a community. They never talk to each other. They’ve got basically zero influence on what Google chooses to do with their mouseclicks. What’s “collective” about that?

Talking about “collective intelligence” is like talking about “the invisible hand of the market.” Markets don’t have any real invisible hands. That is a metaphor. And “collective intelligence” doesn’t have any human will or any consciousness. “Collective intelligence” isn’t intelligently trying to make our lives better, it’s not an abstract force for good.

“Collective credit-card fraud intelligence” — that is collective intelligence, too. “Collective security-vulnerabilities intelligence” — that’s powerful, it’s incredibly fast, it’s not built by any one guy in particular, and it causes billions of dollars of commercial damage and endless hours of harassment and fear to computer users.

I really think it’s the original sin of geekdom, a kind of geek thought-crime, to think that just because you yourself can think algorithmically, and impose some of that on a machine, that this is “intelligence.” That is not intelligence. That is rules-based machine behavior. It’s code being executed. It’s a powerful thing, it’s a beautiful thing, but to call that “intelligence” is dehumanizing. You should stop that. It does not make you look high-tech, advanced, and cool. It makes you look delusionary.
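For what it’s worth, here is how small the trick can be. A toy sketch, with invented click data and document names, that aggregates mouseclicks into a ranking, roughly the way granular input gets weighted into “relevance”:

```javascript
// "Collective intelligence," reduced to what it is: rules executed over
// aggregated input. The click data and document names are invented.
const clicks = [
  { doc: "turtles-faq", user: "a" },
  { doc: "turtles-faq", user: "b" },
  { doc: "duct-tape-howto", user: "a" },
  { doc: "turtles-faq", user: "c" },
];

// Count clicks per document, then sort. That is the whole "mind."
function rank(clickLog) {
  const counts = new Map();
  for (const { doc } of clickLog) {
    counts.set(doc, (counts.get(doc) || 0) + 1);
  }
  return [...counts.entries()].sort((x, y) => y[1] - x[1]);
}

console.log(rank(clicks)); // [ [ 'turtles-faq', 3 ], [ 'duct-tape-howto', 1 ] ]
```

Useful, fast, emergent, and exactly as conscious as a sort routine.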

There’s something sad and pathetic about it, like a lonely old woman whose only friends are her cats. “I had to leave my 14 million dollars to Fluffy because he loves me more than all those poor kids down at the hospital.”

This stuff we call “collective intelligence” has tremendous potential, but it’s not our friend — any more than the invisible hand of the narcotics market is our friend.

Markets look like your friend when they’re spreading prosperity your way. If they get some bug in their ear from their innate Black Swan instability, man, markets will starve you! The Invisible Hand of the market will jerk you around like a cat-o’-nine-tails.

So I’d definitely like some better term for “collective intelligence,” something a little less streamlined and metaphysical. Maybe something like “primeval meme ooze” or “semi-autonomous data propagation.” Even some Kevin Kelly style “neobiological out of control emergent architectures.” Because those weird new structures are here, they’re growing fast, we depend on them for mission-critical acts, and we’re not gonna get rid of them any more than we can get rid of termite mounds.

So, you know, whatever next? Web 2.0, five years old, and sounding pretty corny now. I loved Web 2.0 — I don’t want to be harsh or dismissive about it. Unlike some critics, I never thought it was “nonsense” or “just jargon.” There were critics who dismissed Tim’s solar system of ideas and attitudes there. I read those critics carefully, I thought hard about what they said. I really thought that they were philistines, and wrong-headed people. They were like guys who dismissed Cubism or Surrealism because “that isn’t really painting.”

Web 2.0 people were a nifty crowd. I used to meet and interview computer people… the older mainframe crowd, Bell Labs engineers and such. They were smarter than Web 2.0 people because they were a super-selected technical elite.

They were also boring bureaucrats and functionaries. All the sense of fun, the brio had been boiled out of them, and their users were hapless ignoramus creatures whom they despised.

The classic Bell subset telephone, you know, black plastic shell, sturdy rotary dial… For God’s sake don’t touch the components! That was their emblem. They were creatures of their era, they had the values of their era, that time is gone and we have the real 21st century on our hands. I am at peace with that. I’m not nostalgic. “Even nostalgia isn’t what it used to be.”

Web 2.0 guys: they’ve got their laptops with whimsical stickers, the tattoos, the startup T-shirts, the brainy-glasses — you can tell them from the general population at a glance. They’re a true creative subculture, not a counterculture exactly — but in their number, their relationship to the population, quite like the Arts and Crafts people from a hundred years ago.

Arts and Crafts people, they had a lot of bad ideas — much worse ideas than Tim O’Reilly’s ideas. It wouldn’t bother me any if Tim O’Reilly was Governor of California — he couldn’t be any weirder than that guy they’ve got already. Arts and Crafts people gave it their best shot, they were in earnest — but everything they thought they knew about reality was blown to pieces by the First World War.

After that misfortune, there were still plenty of creative people surviving. Futurists, Surrealists, Dadaists — and man, they all despised Arts and Crafts. Everything about Art Nouveau that was sexy and sensual and liberating and flower-like, man, that stank in their nostrils. They thought that Art Nouveau people were like moronic children.

So — what does tomorrow’s web look like? Well, the official version would be ubiquity. I’ve been seeing ubiquity theory for years now. I’m a notorious fan of this stuff. A zealot, even. I’m a snake-waving street-preacher about it. Finally the heavy operators are waking from their dogmatic slumbers; in the past eighteen months, 24 months, we’ve seen ubiquity initiatives from Nokia, Cisco, General Electric, IBM… Microsoft even, Jesus, Microsoft, the place where innovative ideas go to die.

But it’s too early for that to be the next stage of the web. We got nice cellphones, which are ubiquity in practice, we got GPS, geolocativity, but too much of the hardware just isn’t there yet. The batteries aren’t there, the bandwidth is not there, RFID does not work well at all, and there aren’t any ubiquity pure-play companies.

So I think what comes next is a web with big holes blown in it. A spiderweb in a storm. The turtles get knocked out from under it, the platform sinks through the cloud. A lot of the inherent contradictions of the web get revealed, the contradictions in the oxymorons smash into each other.

The web has to stop being a meringue frosting on the top of business, this make-do melange of mashups and abstraction layers.

Web 2.0 goes away. Its work is done. The thing I always loved best about Web 2.0 was its implicit expiration date. It really took guts to say that: well, we’ve got a bunch of cool initiatives here, and we know they’re not gonna last very long. It’s not Utopia, it’s not a New World Order, it’s just a brave attempt to sweep up the ashes of the burst Internet Bubble and build something big and fast with the small burnt-up bits that were loosely joined.

That showed more maturity than Web 1.0. It was visionary, it was inspiring, but there were fewer moon rockets flying out of its head.

“Gosh, we’re really sorry that we accidentally ruined the NASDAQ.” We’re Internet business people, but maybe we should spend less of our time stock-kiting. The Web’s a communications medium — how ’bout working on the computer interface, so that people can really communicate?

That effort was time well spent. Really.

A lot of issues that Web 1.0 was sweating blood about, they went away for good. The “digital divide,” for instance. Man, I hated that. All the planet’s poor kids had to have desktop machines. With fiber optic. Sure! You go to Bombay, Shanghai, Lagos even, you’re like “hey kid, how about this OLPC so you can level the playing field with the South Bronx and East Los Angeles?” And he’s like “Do I have to? I’ve already got three Nokias.” The teacher is slapping the cellphone out of his hand because he’s acing the tests by sneaking in SMS traffic.

“Half the planet has never made a phone call.” Boy, that’s a shame — especially when pirates in Somalia are making satellite calls off stolen supertankers. The poorest people in the world love cellphones. They’re spreading so fast they make PCs look like turtles.

Digital culture, I knew it well. It died — young, fast and pretty. It’s all about network culture now.

We’ve got a web built on top of a collapsed economy. THAT’s the black hole at the center of the solar system now. There’s gonna be a Transition Web. Your economic system collapses: Eastern Europe, Russia, the Transition Economy, that bracing experience is for everybody now. Except it’s not Communism transitioning toward capitalism. It’s the whole world into transition toward something we don’t even have proper words for.

The Web has always had an awkward relationship with business. Web 2.0 was a business model. The Transition Web is a culture model. If it’s gonna work, it’s got to replace things that we used to pay for with things that we just plain use.

In Web 2.0, if you were monetizable, it meant you got bought out by the majors. “We stole back our revolution and we sold ourselves to Yahoo.” Okay, that was embarrassing, but at least it meant you could scale up and go on.

In the Transition Web, if you’re monetizable, it means that you get attacked. You gotta squeeze a penny out of every pixel because the owners are broke. But if you do that to your users, they will vaporize, because they’re broke too, just like you; of course they’re gonna migrate to stuff that’s free.

After a while you have to wonder if it’s worth it — the money model, I mean. Is finance worth the cost of being involved with finance? The web smashed stocks. Global banking blew up all over the planet all at once… Not a single country anywhere with a viable economic policy under globalization. Is there a message here?

Are there some non-financial structures that are less predatory and unstable than this radically out-of-kilter invisible hand? The invisible hand is gonna strangle us! Everybody’s got a hand out — how about offering people some visible hands?

Not every Internet address was a dotcom. In fact, dotcoms showed up pretty late in the day, and they were not exactly welcome. There were dot-orgs, dot-edus, dot-nets, dot-govs, and dot-localities.

Once upon a time there were lots of social enterprises that lived outside the market; social movements, political parties, mutual aid societies, philanthropies. Churches, criminal organizations — you’re bound to see plenty of both of those in a transition… Labor unions… not little ones, but big ones like Solidarity in Poland; dissident organizations, not hobby activists, big dissent, like Charter 77 in Czechoslovakia.

Armies, national guards. Rescue operations. Global non-governmental organizations. Davos Forums, Bilderberg guys.

Retired people. The old people can’t hold down jobs in the market. Man, there’s a lot of ’em. Billions. What are our old people supposed to do with themselves? Websurf, I’m thinking. They’re wise, they’re knowledgeable, they’re generous by nature; the 21st century is destined to be an old people’s century. Even the Chinese, Mexicans, Brazilians will be old. Can’t the web make some use of them, all that wisdom and talent, outside the market?

Market failures have blown holes in civil society. The Greenhouse Effect is a market failure. The American health system is a market failure — and most other people’s health systems don’t make much commercial sense. Education is a loss leader and the university thing is a mess.

Income disparities are insane. The banker aristocracy is in hysterical depression. Housing is in wreckage; the market has given us white-collar homeless and a million empty buildings.

The energy market is completely freakish. If you have no fossil fuels, you shiver in the dark. If you do have them, your economy is completely unstable, your government is corrupted and people kill you for oil.

The human trafficking situation is crazy. In globalization people just evaporate over borders. They emigrate illegally and grab whatever cash they can find. If you don’t export you go broke from trade imbalances. If you do export, you go broke because your trading partners can’t pay you…

Kinda hard to face up to all this, especially when it’s laid out in this very bald fashion.

But you know, I’m not scared by any of this. I regret the suffering, I know it’s big trouble — but it promises massive change and a massive change was inevitable. The way we ran the world was wrong.

I’ve never seen so much panic around me, but panic is the last thing on my mind. My mood is eager impatience. I want to see our best, most creative, best-intentioned people in world society directly attacking our worst problems. I’m bored with the deceit. I’m tired of obscurantism and cover-ups. I’m disgusted with cynical spin and the culture war for profit. I’m up to here with phony baloney market fundamentalism. I despise a prostituted society where we put a dollar sign in front of our eyes so we could run straight into the ditch.

The cure for panic is action. Coherent action is great; for a scatterbrained web society, that may be a bit much to ask. Well, any action is better than whining. We can do better.

I’m not gonna tell you what to do. I’m an artist, I’m not running for office and I don’t want any of your money. Just talk among yourselves. Grow up to the size of your challenges. Bang out some code, build some platforms you don’t have to duct-tape any more, make more opportunities than you can grab for your little selves, and let’s get after living real lives.

The future is unwritten. Thank you very much.