At first, companies may be able to write off the lost range or fuel. “It’s not a huge problem for the early applications, where we expect them to be used,” says Chris Urmson, who ran Google’s self-driving program and is now CEO of Aurora, a self-driving startup that has partnered with Volkswagen, Hyundai, and Chinese automaker Byton. That’s because the first robocars will likely be city-bound fleets of electric shuttles, moseying along at low speeds and able to recharge often.

Buyers of regular cars aren't likely to be so forgiving, though. Maybe you're old enough to remember a parent who turned off the car's AC to save gas. Now imagine having to turn off the self-driving features just to reach your destination without running out of electrons.

The good news is that the folks who make the chips buried in the car’s computers are on the case. At CES last month, Nvidia put the spotlight on a new processor designed specifically for autonomous vehicles, called Xavier. It has an eight-core CPU and 512-core GPU, a deep learning accelerator, computer vision accelerators, and 8K video processors. The company says it's the most complex system on a chip ever created. “We’re bringing supercomputing from the data center into the car,” says the company’s man in charge of automotive, Danny Shapiro. But what’s key is that Xavier does more work for less power. “We’re able to deliver 30 trillion operations per second, all on a single SOC, or system on chip, that consumes 30 watts of energy.”

Even that’s not good enough for full self-driving vehicles. Nvidia believes that a fully self-sufficient, no-steering-wheel-or-pedals kind of driverless car will need to run on a platform it’s calling Pegasus. With two Xavier chips and two more GPUs, this platform can crunch 320 trillion operations per second and keep power consumption to an acceptable 500 watts.
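Those figures imply a rough efficiency and energy budget that's easy to sanity-check. Here's a back-of-the-envelope sketch using only Nvidia's stated numbers; the 300 watt-hours-per-mile figure for a typical electric car is an outside ballpark assumption, not something Nvidia quotes:

```python
# Back-of-the-envelope math from the stated Xavier and Pegasus figures.
xavier_tops, xavier_watts = 30, 30        # Xavier: 30 trillion ops/sec at 30 W
pegasus_tops, pegasus_watts = 320, 500    # Pegasus: 320 trillion ops/sec at 500 W

# Efficiency in trillions of operations per second per watt.
xavier_eff = xavier_tops / xavier_watts       # 1.0 TOPS/W
pegasus_eff = pegasus_tops / pegasus_watts    # 0.64 TOPS/W

# Energy the compute platform draws over a one-hour drive, in watt-hours.
drive_hours = 1.0
pegasus_energy_wh = pegasus_watts * drive_hours  # 500 Wh

# Assumed EV consumption of ~300 Wh per mile (a ballpark, not a quoted spec).
ev_wh_per_mile = 300
range_cost_miles = pegasus_energy_wh / ev_wh_per_mile

print(f"Xavier: {xavier_eff:.2f} TOPS/W, Pegasus: {pegasus_eff:.2f} TOPS/W")
print(f"Pegasus costs roughly {range_cost_miles:.1f} miles of range per hour")
```

Under those assumptions, a Pegasus-class computer sips under two miles of range per hour of driving, which is why a 500-watt budget counts as "acceptable" next to the kilowatts that racks of prototype hardware can draw.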

Nvidia's competitors are chasing the same goals. Intel is developing low-power chips optimized for self-driving cars, Tesla is building its own chip for Autopilot, and Qualcomm is building the communications hardware those cars will need—all with low power and efficiency in mind.

Specialized automotive chips help with other practical problems. Open the trunk on one of the self-driving prototypes running around Phoenix or San Francisco and you'll likely see racks of computer equipment. Some of that is for testing and development—designers want to capture and record every moment the car's in motion—and a consumer-facing version will require less hardware. But for regular car buyers, having somewhere to throw the groceries isn't negotiable.

If your laptop has ever burned your legs, you know computers get hot when they work hard. That heat is wasted energy, and it's also not something you want in your car on a hot day. Some robocar prototypes need water-cooling with hoses and radiators, which eat up even more space. So now the race is on to compact all that prototype equipment down to something the size of a laptop and tuck it away behind the glovebox, where it can be reached for upgrades but mostly ignored. New chips, with their lower power requirements, help here too: They generate less heat, so they can get away with a small cooling fan and smaller packaging.

The car industry has a role model in the consumer electronics business, where devices get ever smaller and more capable. “All of our customers are always saying more performance, lower power—we need to do that across all markets,” says John Ronco, VP of product marketing at ARM, which designs the basic architecture of chips you find in most modern smartphones, as well as Nvidia's self-driving chips.

It’s the age-old refrain—more for less—but it’s crucial if you want your first roboride to be in something a little more comfortable than a lurching blue panel van.

Real World, Real Problems