Tesla’s vehicles are unique in many ways, including the fact that as of October 2016, every new Tesla coming off the production line is equipped with what amounts to a “supercomputer”.

Now the company’s new director of artificial intelligence has compared the fleet to “a large, distributed, mobile data center.”

Last month, Tesla hired Andrej Karpathy, a renowned AI researcher, as the new head of AI and Autopilot vision.

The young scientist has made a name for himself in the machine learning/neural net community after research and teaching work at Stanford, Google, and Elon Musk’s OpenAI.

His work impressed Musk enough to move him from his non-profit AI research organization to his for-profit electric vehicle company as a director. Since joining Tesla, he has made a few comments on Twitter about his work at the company.

After his first week at Tesla, he was fairly impressed by the pace:

First week @ Tesla turned into an intense (in a good way!) firehose. I forgot I own a Twitter account. — Andrej Karpathy (@karpathy) July 1, 2017

Later, he said that he was already testing a new Autopilot software build on a Model X:

Driving around PA with a Ludicrous mode Model X, testing a new Autopilot build. I see it will take a while before this gets old. — Andrej Karpathy (@karpathy) July 3, 2017

In another comment that he later removed, Karpathy made the comparison between Tesla’s fleet and “a large, distributed, mobile data center.”

“The Tesla fleet is like a large, distributed, mobile data center. Each machine is attached to a big battery, a person, and moves very fast.”

While data centers typically store data more than they gather it, it’s easy to see where the comparison comes from.

As we recently reported, Tesla significantly increased its data gathering effort two months ago, when it started recording and uploading video from the suite of cameras on Model S and Model X vehicles equipped with second-generation Autopilot hardware.

The sensor suite consists of 8 cameras (3 front-facing and 5 around the vehicle), 1 front-facing radar, ultrasonic sensors all around, advanced GPS, and an Nvidia Drive PX2 system. We recently got a rare look inside the computer.

That’s the hardware Tesla’s Autopilot software runs on, and it’s what gathers the data sent back to Tesla to feed the company’s neural net. That’s also where Karpathy comes in, since this is his area of expertise: he trains neural nets to process large amounts of data in order to perform specific tasks. In this case, that means recognizing images and making driving decisions based on them.
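To illustrate the general idea of training a model to classify images (this is only a toy sketch using NumPy and synthetic data, not anything resembling Tesla’s actual system or Karpathy’s work), here is a minimal classifier that learns to tell apart two kinds of artificial 8×8 “images” by gradient descent:

```python
import numpy as np

rng = np.random.default_rng(0)

def make_images(n):
    """Synthetic 8x8 grayscale 'images': class 0 is bright in the
    top half, class 1 is bright in the bottom half."""
    X = rng.normal(0.0, 0.1, size=(n, 8, 8))
    y = rng.integers(0, 2, size=n)
    X[y == 0, :4, :] += 1.0   # brighten top half for class 0
    X[y == 1, 4:, :] += 1.0   # brighten bottom half for class 1
    return X.reshape(n, 64), y  # flatten each image to a 64-vector

X_train, y_train = make_images(500)
X_test, y_test = make_images(200)

# A single-layer logistic classifier trained by gradient descent.
w = np.zeros(64)
b = 0.0
lr = 0.1
for _ in range(200):
    z = X_train @ w + b
    p = 1.0 / (1.0 + np.exp(-z))  # predicted probability of class 1
    grad_w = X_train.T @ (p - y_train) / len(y_train)
    grad_b = np.mean(p - y_train)
    w -= lr * grad_w
    b -= lr * grad_b

pred = (X_test @ w + b) > 0
accuracy = np.mean(pred == y_test)
print(f"test accuracy: {accuracy:.2f}")
```

Real perception systems use deep convolutional networks trained on millions of labeled frames rather than a single linear layer, but the core loop is the same: feed in data, measure the error, and nudge the weights to reduce it.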

We have already seen several significant updates to Tesla’s second-generation Autopilot recently, but it will be interesting to see what comes next now that Karpathy is involved. Though it could take some time for his impact, or the impact of the recent increase in data collection, to make it to the fleet.

