LAS VEGAS—Move over, Tegra K1, you’re already obsolete. At a Sunday night press conference, Nvidia CEO Jen-Hsun Huang launched a new mobile superchip, the Tegra X1. If Nvidia has its way, the X1 will be the graphics and artificial intelligence engine for the car of tomorrow.

The Nvidia Tegra X1 is a "mobile superchip," according to Nvidia chief executive Jen-Hsun Huang. (Image: Mark Hachman)

The Tegra X1 includes a 256-core “Maxwell” GPU, built on the same graphics architecture that Nvidia launched last February and that powers the GeForce GTX 980 and GTX 970 cards the company shipped this fall. The new X1 pairs that GPU with an eight-core, 64-bit ARM CPU. All told, the X1 can process 4K video at 60 frames per second, using either the H.265 or VP9 video codec.

At last year’s Consumer Electronics Show in January, Nvidia’s Huang debuted the Tegra K1, a mobile chip designed for tablets, cars, and other embedded applications. Then in August, Nvidia disclosed the performance of the 64-bit “Denver” derivative. According to Darrell Boggs, a chip architect for Nvidia, the “Denver” chip and the 32-bit version of the Tegra K1 share the same 192-core “Kepler” graphics core that helps give the K1 its performance. But the 64-bit Denver includes optimizations that can push the number of instructions it processes per clock cycle to seven, versus just three for the 32-bit version.

Nvidia shows off a smart dashboard concept during its CES 2015 speech. (Image: Mark Hachman)

But now we have the Tegra X1, which forms the foundation of Drive CX, a new automotive platform. And by pairing two Tegra X1 chips together, Nvidia plans to market a cutting-edge platform called Drive PX, which combines real-time machine vision with deep learning to evaluate road signs, pedestrians, and other road hazards.

Why this matters: Nvidia is poised to make an even harder play for the car, a platform with millions of potential upgrade candidates, all waiting for better graphics and safety features that depend on intense processing power. Eventually, those cars will drive themselves, and Nvidia wants to be the driver behind that virtual wheel.

Driving into the future

The Tegra X1 is the first teraflop mobile supercomputer, equivalent to the fastest supercomputer in the world circa 2000, a machine that harnessed some 10,000 Pentium Pro processors, Huang said.
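As a back-of-the-envelope check, the teraflop figure is plausible for 256 Maxwell cores if each core retires one fused multiply-add (two floating-point operations) per clock and the half-precision path handles two values at a time, as on Nvidia's desktop Maxwell parts. The roughly 1GHz clock below is an assumption for illustration, not a figure Nvidia quoted at the event.

```python
cores = 256          # Maxwell shader cores in the Tegra X1
fma_flops = 2        # one fused multiply-add counts as two FLOPs
fp16_width = 2       # assumed two-wide FP16 path per core
clock_hz = 1.0e9     # assumed ~1 GHz GPU clock (illustrative)

flops = cores * fma_flops * fp16_width * clock_hz
print(f"{flops / 1e12:.3f} TFLOPS")  # just over one teraflop
```

Under those assumptions the math works out to about 1.024 teraflops, which matches the "first teraflop mobile chip" framing.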

“So the question is what are we going to do with all that horsepower? ... It turns out that the future car will have a massive amount of horsepower inside of it,” Huang said.

Nvidia's Drive platform can apply textures to gauges. So if you're desperate for a bamboo speedometer, there you go. (Image: Mark Hachman)

Drive CX looks a lot like the car demonstration Nvidia showed off at the launch of the K1 last year, with dashboard gauges that featured real-time shadows and rendered paint and surfaces. Huang seemed especially proud of the tech's modeled surfaces—including bamboo and porcelain—without really considering how quickly drivers glance at gauges while they drive.

“End to end platform all the way from the processor to the software,” Huang said of Drive CX and the Nvidia Studio software that powers it.

A more impressive demonstration, however, delved into benchmarks: The X1 apparently performs roughly twice as fast as the Tegra K1 in some benchmarks. The X1 even ran DirectX demos at 10 watts of power consumption that the ostensibly more powerful AMD-based Xbox One runs at 100 watts.

Huang didn’t say anything about tablets like the Nvidia Shield Tablet, last year's showcase device for the K1. But one can assume that Nvidia will eventually build a next-generation tablet that includes the X1.

Huang also said the Drive platform could be used to intelligently improve driver-assist features, which currently rely on radar, ultrasonic, and computer-vision technologies. Increasingly, those sensors are being supplanted by camera-based systems, which are getting better and better at detecting objects in low light. Eventually, chips like the X1 will become the foundation for self-driving cars, Huang said, complete with frequent software updates.

“We imagine all these cameras around the car connected to a supercomputer inside the car,” Huang said.

Huang said the PX platform can detect and identify different kinds of objects, even different types of cars—including police cars. PX will also try to match objects—Is that a pedestrian? Is that a speed sign?—and compare them against a database, Huang said.

The Nvidia Drive PX platform uses intelligent vision capabilities and deep learning to identify other objects on the road. (Image: Mark Hachman)

To accurately sense what’s around it, however, a car must learn to filter and classify objects on its own, using a technology called deep learning. At present, Nvidia’s Drive PX architecture can accurately detect only about 80 percent of the objects it sees, as measured by the ImageNet Challenge benchmark. But Huang said Nvidia has tested the technology in the field, identifying speed-limit signs and even partially occluded pedestrians.
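The final stage of that kind of classifier can be sketched in a few lines: a deep network reduces a camera frame to a feature vector, then scores it against each known class and reports the best match with a confidence. Everything here—the class names, the random stand-in features, and the weights—is invented for illustration and is not Nvidia's actual pipeline, where convolutional layers running on the GPU would produce the features.

```python
import math
import random

# Hypothetical road-object classes, stand-ins for Drive PX's real categories.
CLASSES = ["pedestrian", "speed-limit sign", "police car"]

def softmax(scores):
    """Convert raw per-class scores into probabilities that sum to 1."""
    m = max(scores)                       # subtract max for numeric stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def classify(features, weights):
    """Score a feature vector against each class; return best label + confidence."""
    scores = [sum(w * f for w, f in zip(row, features)) for row in weights]
    probs = softmax(scores)
    best = max(range(len(probs)), key=probs.__getitem__)
    return CLASSES[best], probs[best]

# Stand-in feature vector; in a real system, convolutional layers on the
# GPU would compute this from the camera frame.
random.seed(1)
features = [random.gauss(0, 1) for _ in range(8)]
weights = [[random.gauss(0, 1) for _ in range(8)] for _ in CLASSES]

label, confidence = classify(features, weights)
print(label, round(confidence, 2))
```

The 80-percent ImageNet figure cited above is essentially how often the top-scoring label from a step like this matches the true object.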

Huang went on and on at his CES keynote, stretching the presentation into at least two hours. But his point was clear: Nvidia wants to power the connected car, and it believes it has the architecture to work something close to miracles.
