The most prominent theme at last week's Intel Developer Forum was Intel's push into the consumer electronics space, a push that starts in earnest at 32nm with products aimed at televisions, set-top boxes, and mobile devices. In fact, a whole day of the conference was dedicated to Intel's thinking on "convergence," a trend that's up there with "virtual reality" and micropayments in the pantheon of Next Big Things that never actually materialized. Intel is, of course, well aware that "convergence" is a punchline, so the company has taken pains to specify that it's talking about a "new convergence," despite the fact that it's still essentially about hooking a PC up to your TV.

What's supposed to be different this time is that convergence makes TV personal... and it also makes it social. So the converged TV experience will be both individualized and collective, get it? OK, let's try again: the converged TV of the future will be so highly personalized that it will show you content and advertising precisely tailored to your interests (gleaned via phone-based GPS tracking and your Google history), while also somehow bringing the television experience back to a bygone, black-and-white era when we all felt connected by virtue of the fact that everyone watched one of three networks.

If you think that these two notions of TV as hyper-personalized, on the one hand, and hyper-social, on the other, are "in tension" with one another (to use the popular grad school euphemism), you're not alone. The whole time I was watching Justin Rattner's convergence keynote, I felt like I almost got it, but not really. I even tried to score an interview with one of Intel's ethnographers so that he could explain it to me, but apparently the whole company is sleeping off a post-IDF hangover because I'm still waiting for someone to get back to me on this.

And the ethnographers should be able to answer my questions, because they're the ones whose job it is to come up with these new notions of what TV should be once TVs are powered by x86 processors. They're the guys who are tasked with dreaming up problems for which x86 is the solution.

The love child of Skynet and Clippy

Most of what Intel's ethnographers have cooked up in the way of TV-related problems for Intel to solve tends to lie toward the "personal" end of the spectrum. Specifically, Intel hopes that an x86-powered television will be able to zero in on which parts of the billions of hours of soon-to-be-available networked video you'll actually want to see, and then automatically populate a playlist with them.

The demo of this automatic playlist capability involved Rattner discovering that his video playlist had been automatically populated with guitar-related content, because he had taken his GPS-equipped MID to Guitar Center over the weekend to look for a guitar for his son. But this example, as technically impressive (and mildly creepy) as it was, actually demonstrates a common problem with automatic recommendation schemes: it wasn't clear that Rattner himself actually cares about guitars. The computer saw that he had gone to Guitar Center and apparently assumed that he was shopping for himself. In my own life, this same problem crops up with Amazon's recommendation engine, which, because my wife also buys books on my Amazon account, kept showing me books about breastfeeding for months after my daughter was born.

My point here is that computers are still notoriously bad at anticipating what we want, and they probably always will be, no matter how much Intel they have inside. This isn't a problem that Intel can solve; I still have no idea what my wife wants half the time, and I'm a human who lives with her. This being the case, what Intel is likely to end up with before it's forced to scale back its ambitions is something like a cross between Skynet and Clippy, the Microsoft Office paper clip: all-seeing, ever-present, and all-annoying.

The parts of Rattner's demo that held real promise were the ones that put humans in charge of all that processor horsepower and enabled them to drill down through video content to find exactly the clips they want. For instance, Rattner demonstrated a technology that identifies the players in a soccer game, tracks the commentary feed to pick up highlights, and stores all of this metadata for user browsing. The end result is that a soccer fan can quickly put together her own highlight reel from last night's game, with a special focus on her favorite player.
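At its core, the demo Rattner showed is just metadata tagging plus user-driven filtering. Here's a minimal sketch of that idea in Python; the clip records, field names, and player names are entirely made up for illustration, since Intel hasn't published any API for this technology:

```python
# Hypothetical sketch: clips tagged with players and highlight flags,
# filtered on demand into a personal highlight reel. Data and field
# names are illustrative only, not from Intel's demo.

clips = [
    {"start": 754,  "end": 762,  "players": ["Marta"],            "highlight": True},
    {"start": 1310, "end": 1318, "players": ["Marta", "Cristiane"], "highlight": False},
    {"start": 2405, "end": 2412, "players": ["Marta"],            "highlight": True},
]

def highlight_reel(clips, player):
    """Return highlight clips featuring the given player, in broadcast order."""
    picks = [c for c in clips if c["highlight"] and player in c["players"]]
    return sorted(picks, key=lambda c: c["start"])

reel = highlight_reel(clips, "Marta")
```

The hard part, of course, is generating those tags automatically from video and commentary; once the metadata exists, "my own private SportsCenter" is a simple query over it.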

"So this is like my own private SportsCenter," said Rattner. "I decide what I want to see."

That's convergence I can believe in, and if Intel can deliver more of those kinds of experiences to TV watchers, they'll have no problem finding an audience.