Tesla isn’t necessarily a big data company, but it is the king of big data in the realm of autonomous vehicles.

At Tuesday’s EmTech Digital, Sterling Anderson, the director of Tesla’s Autopilot program, showed off the surprising strides Autopilot has made in the past 18 months. With 70,000 Autopilot-enabled vehicles on the road, Tesla gathers a million miles of driving data on people and autonomy every 10 hours. In all, Autopilot has driven some 100 million miles and gathered data on 780 million human-driven miles.

That means that Tesla gathers the same amount of driving data every day as Google’s autonomous program has gathered since it started in 2009. The major caveat with that data? Quantity doesn’t equal quality.

Highway Data vs. City Data

Comparing Tesla’s data to Google’s data isn’t exactly a fair fight because Autopilot includes highway miles in its data total. Meanwhile, Google started its testing on freeways but “shifted focus to city streets, a much more complex environment than freeways.”

Tesla’s data is crucial for learning about driving at highway speeds, but highways are naturally easier for vehicles than city roads. Highways are literally built only for vehicles, whereas city roads are shared with pedestrians, cyclists, and people walking peacocks. All of those things contribute to a sometimes-complex driving environment.

Sterling Anderson explaining Tesla's miles. (Photo: Steve Jurvetson)

Tesla also gathers data on drivers when Autopilot isn’t turned on, but that is driver data, not driverless data, so the 780 million human-driven miles don’t provide feedback that is nearly as valuable.

While Tesla wins the numbers game, Google’s autonomous program is learning about complex problems that all of Tesla’s Autopilot data put together can’t solve.

Fully Autonomous vs. Semi-Autonomous

Autopilot is a semi-autonomous system. That suits today’s legal landscape, since government officials are still far from enacting regulations for fully driverless cars. But semi-autonomy can only learn so much.

According to National Highway Traffic Safety Administration standards, Google’s self-driving cars are treated the same as people when it comes to liability. The system is much more advanced, and therefore gathers much more advanced data.

At the EmTech conference, Chris Urmson, Google’s self-driving car director, described how advanced the system truly is. In Urmson’s example, Google’s autonomous car uses “semantic understanding” to know what is going on in the world and figure out what to do about it. It understood not just that there was an object in front of the vehicle, but that a police officer was getting out of a car.

Customers as Test Drivers

Google has no way to compete with Tesla in terms of the number of cars on the road, because Tesla released its system while it was still in beta. The test drivers are Tesla customers.

The ethics of doing this can be debated at length. No matter how many notices Tesla gives that a licensed, fully aware driver still needs to be behind the wheel, people are people. Some take advantage of napping opportunities; others show off Summon, technology that isn’t ready to operate without a person in the car.

Since Google doesn’t put everyday people behind the wheel of the autonomous technology it is testing, it can never match up to Tesla’s numbers.

Tesla has mobilized an impressive number of vehicles with advanced testing. That offers a chance to collect loads of data, but quantity is no substitute for quality.

If Tesla and Google could put aside competition and collaborate, however, the timeline for true autonomy would be a whole lot shorter.