Last year, HP announced it was building The Machine, a computer meant to leap as far above conventional modern systems as a high-end Xeon workstation sits above a 1960s IBM mainframe. The system was designed around special-purpose cores and memristors as a universal memory architecture, with everything tied together through extensive use of silicon photonics. It was bold, ambitious, and cutting-edge. And now, it’s pretty much dead.

That’s not the spin HP is putting on it, of course. Last week, HP announced that The Machine would shift to a “memory-driven architecture” focused on storing vast amounts of data rather than marshaling huge amounts of processing power. In and of itself, that’s still an incredibly useful system: memristors are supposed to dramatically improve power consumption over conventional DRAM architectures, and ramping up memory capacity without breaking power budgets is a key challenge facing exascale computing.

The problem, unfortunately, is that Martin Fink, HP’s chief technology officer, also said The Machine will be based on more conventional DRAM. Instead of a special-purpose OS (dubbed Linux++ last year and meant to mimic the platform’s memristor and photonic design in software), it will simply run a version of Linux. The sticking point, apparently, was the memristor itself, which HP hasn’t found a way to produce in commercial volume or at a reasonable price.

“We way over-associated this with the memristor,” Mr. Fink said in an interview. “We’re doing what we can to keep it working within existing technology.”

A dream deferred

According to HP, The Slightly Less Incredible Machine it debuts next year will still offer up to 320TB of memory. The company is also focused on another, slightly more realistic memory technology: phase-change memory, or PCM. We’ve discussed PCM multiple times before, and it has one clear advantage over the memristor: it actually exists outside the lab and can be purchased in some quantity. Unfortunately, HP was unable to offer a plausible timeline for adopting this alternate approach, or anything approaching the comprehensive vision it articulated for the original system.

HP’s decision to delay isn’t particularly surprising. The company dropped The Machine on the world out of the blue last year, implying that technologies that have mostly occupied labs to date were on the verge of becoming enterprise-qualified and shipping in volume. Compare that with the slow progress other companies have made on everything from quantum computing to silicon photonics, and HP was essentially arguing it held a collection of IP that would let it leapfrog the competition.

Such inflection points do happen, to be sure, but they are often clearer in hindsight than they were at the time. The IBM PC may have been a revolutionary product for IBM, but it was scarcely the first desktop, and its performance was nothing special. It’s quite rare for a company to deliver a comprehensive technology leap that rockets ahead of everything that came before, particularly one that leans on simultaneous advances in memory, optics, and SoC design.

The Machine as HP envisioned it appears effectively moribund. The holy grail of unified memory, and the memristors to power it, will have to wait a while longer. HP says it may put the technology in printers, which some may remember as the place HP technology goes to die.