At last year’s CES, NVIDIA CEO Jen-Hsun Huang unveiled the original Tegra X1-based Drive PX digital cockpit computer and discussed some of the difficulties associated with self-driving cars, along with the technologies that would be needed to overcome them. He discussed deep learning, deep neural nets, and the massive amounts of data that must be collected and processed to ensure self-driving cars are not only safe, but also instill confidence in passengers and provide a good experience.
















The Liquid-Cooled NVIDIA Drive PX 2

Since then, Jen-Hsun claims, “several thousand man-years of effort have gone into developing the technology to enable self-driving cars” at NVIDIA. At the company’s press conference held at the Four Seasons Hotel just prior to the start of CES, he unveiled the upcoming NVIDIA Drive PX 2. We’ve got video of the event for you here, should you want to see the unveil and associated demos as they happened...

NVIDIA calls the Drive PX 2 the “world’s first in-car artificial intelligence supercomputer”. The NVIDIA Drive PX 2 features a pair of next-gen Tegra processors and a pair of next-gen Pascal-based GPUs, for a total of 12 CPU cores and 8 TFLOPS, or 24 DL TOPS (Deep Learning Tera-OPS), of compute performance. If you’re wondering what DL TOPS are, they’re specialized instructions that accelerate the math used in deep learning network inference. And according to NVIDIA, the Drive PX 2 offers over 10 times more computational horsepower than the previous-generation product.
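To give a feel for why reduced-precision “deep learning ops” are counted separately from FLOPS, here is a minimal sketch of a dot product carried out on 8-bit integers with a single floating-point rescale at the end. This is a generic symmetric-quantization illustration, not NVIDIA’s actual DL TOPS implementation; all names and scale values are assumptions for the example.

```python
# Toy illustration of reduced-precision inference math: the bulk of the work
# is integer multiply-accumulate, with one float rescale at the end.
# Generic symmetric quantization -- NOT NVIDIA's actual implementation.

def quantize(values, scale):
    """Map floats to the int8 range [-127, 127] using a fixed scale."""
    return [max(-127, min(127, round(v / scale))) for v in values]

def int8_dot(a, b, scale_a, scale_b):
    """Integer multiply-accumulate, then a single float rescale."""
    acc = sum(x * y for x, y in zip(a, b))  # pure integer math
    return acc * scale_a * scale_b

weights = [0.5, -0.25, 0.125, 1.0]
inputs  = [0.2,  0.4, -0.6,  0.1]

sa, sb = 0.01, 0.01  # hypothetical quantization scales
q_w = quantize(weights, sa)
q_x = quantize(inputs, sb)

exact  = sum(w * x for w, x in zip(weights, inputs))
approx = int8_dot(q_w, q_x, sa, sb)
print(exact, approx)  # the int8 result closely tracks the float result
```

Because the accumulation runs on narrow integers, hardware can pack several of these operations into the silicon area and power budget of one full-precision multiply, which is how a chip rated at 8 TFLOPS can claim a higher deep-learning ops figure.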











Dual Pascal GPUs On Removable Modules

The Drive PX 2 needs all of this performance because it can process the inputs of 12 video cameras, plus lidar, radar, and ultrasonic sensors. All of the incoming data from those sensors must be processed in real time to provide 360-degree detection of lanes, vehicles, pedestrians, signs, and other potential obstacles.











The Drive PX 2 module consumes upwards of 250 watts of power, and the entire device is liquid-cooled. Jen-Hsun claimed the liquid cooling is necessary for this version of the Drive PX 2 to ensure reliable operation across the wide variety of environmental conditions an automobile is likely to encounter. If the Drive PX 2 is being installed in a vehicle that already has liquid cooling, plumbing can be added to incorporate the Drive PX 2; otherwise, a standalone version with an all-in-one cooling solution is also available.

The chips used in the Drive PX 2 are manufactured on a 16nm FinFET process – presumably at TSMC – but no technical details were revealed, other than that the two next-gen Tegras together feature 8 Cortex-A57 cores and 4 Denver cores. Jen-Hsun did claim that the Drive PX 2 is like “having the power of 150 MacBook Pros in a device the size of a lunch box”, however. How did NVIDIA come up with that comparison? The Core i7 processors in the MacBook Pro are capable of 280 GFLOPS of compute performance, whereas the Drive PX 2 running DRIVENet is capable of roughly 42 TFLOPS, which is equivalent to 6 GeForce GTX Titan X cards. Divide 42 TFLOPS by 280 GFLOPS and you get 150.
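The arithmetic behind the keynote comparison is easy to verify; the throughput figures below are the ones NVIDIA quoted, not independent measurements:

```python
# Sanity-check NVIDIA's "150 MacBook Pros" comparison from the keynote.
drive_px2_flops = 42e12   # ~42 TFLOPS claimed for the Drive PX 2 running DRIVENet
macbook_flops   = 280e9   # ~280 GFLOPS quoted for the MacBook Pro's Core i7

print(drive_px2_flops / macbook_flops)  # → 150.0
```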





NVIDIA's ultimate goal is to create an end-to-end platform for deep learning and autonomous driving that incorporates a deep learning platform for training, deep neural networks, and in-car AI. To that end, in addition to the hardware in the Drive PX 2, NVIDIA has developed an array of tools to accelerate the company’s autonomous driving initiative, namely NVIDIA DRIVENet, DriveWorks, and DIGITS.











Dual Tegra Chips Visible On The Drive PX 2 PCB

NVIDIA DRIVENet is a deep neural network consisting of 9 inception layers (essentially 9 deep neural networks nested within the larger network), 3 convolutional layers, and 37M neurons. It can handle 40B operations per second, and can perform single- and multi-class object detection and segmentation. NVIDIA has been developing the foundation of DRIVENet for a while now, but over the last few months (they showed a data set spanning July to December 2015) they’ve improved the accuracy of its object detection to 88%. The highest score recorded to this point is just under 90%.
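NVIDIA has not published DRIVENet’s layer dimensions, but a back-of-the-envelope count for a single hypothetical convolutional layer shows how quickly a detection network reaches billions of operations per frame:

```python
# Back-of-the-envelope: why a detection network burns billions of ops per frame.
# The layer sizes below are hypothetical, for illustration only -- NVIDIA has
# not disclosed DRIVENet's actual layer dimensions.

def conv_ops(h, w, c_in, c_out, k):
    """Multiply-accumulate count for one k x k convolution layer over an
    h x w feature map (stride 1, 'same' padding assumed)."""
    return h * w * c_out * (k * k * c_in)

# A single 3x3 convolution over a 1280x720 RGB camera frame, 64 output channels:
ops = conv_ops(720, 1280, 3, 64, 3)
print(f"{ops / 1e9:.2f} billion MACs")  # ~1.6 billion MACs for one layer
```

Stack a dozen such layers and run them at camera frame rates, and the tens of billions of operations per second cited for DRIVENet become plausible.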











Object Detection In Action

NVIDIA DriveWorks is a suite of software tools, libraries, and modules used for the development and testing of autonomous vehicles. According to NVIDIA, DriveWorks “enables sensor calibration, acquisition of surround data, synchronization, recording and then processing streams of sensor data through a complex pipeline of algorithms” that run on the Drive PX 2.
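To make the “synchronization” step concrete, here is a hypothetical skeleton of the kind of sensor pipeline NVIDIA describes: samples from cameras, lidar, and radar arrive with slightly different timestamps and must be grouped into coherent frames before any detection algorithm can fuse them. The names and structure below are illustrative assumptions, not the actual DriveWorks API.

```python
# Hypothetical sketch of a sensor-synchronization step, the kind of work
# DriveWorks describes. Not the actual DriveWorks API.
from dataclasses import dataclass

@dataclass
class SensorSample:
    sensor: str        # e.g. "camera_front", "lidar", "radar"
    timestamp_us: int  # capture time in microseconds
    payload: object    # raw frame / point cloud / radar returns

def synchronize(samples, window_us=33_000):
    """Group samples whose timestamps fall within one ~30 fps frame window."""
    samples = sorted(samples, key=lambda s: s.timestamp_us)
    groups, current = [], []
    for s in samples:
        if current and s.timestamp_us - current[0].timestamp_us > window_us:
            groups.append(current)
            current = []
        current.append(s)
    if current:
        groups.append(current)
    return groups

# Three sensors firing within ~5 ms of each other form one fused "frame";
# a camera sample arriving ~33 ms later starts the next frame:
batch = [
    SensorSample("camera_front", 1_000_000, None),
    SensorSample("lidar",        1_004_000, None),
    SensorSample("radar",        1_005_000, None),
    SensorSample("camera_front", 1_033_500, None),
]
frames = synchronize(batch)
print(len(frames))  # 2 fused frames
```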











The Volvo XC90

NVIDIA DIGITS is a tool for developing, training, and visualizing deep neural networks that can run on any NVIDIA GPU-based system, from standard desktop PCs and supercomputers to Amazon Web Services and the recently announced Facebook Big Sur Open Rack-compatible hardware. The model generated by the trained neural network then runs on the Drive PX 2 within the car.





This technology is still likely years away from mainstream adoption by automakers, but NVIDIA was already able to announce its first partner. Volvo Cars will use the NVIDIA Drive PX 2 in a fleet of 100 Volvo XC90 SUVs slated to hit the road next year as part of the Swedish automaker’s Drive Me autonomous-car pilot program.