The self-driving-car crashes that usually make the news are, unsurprisingly, either big and smashy or new and curious. The Apple that got bumped while merging into traffic. The Waymo van that got T-boned. And of course, the Uber that hit and killed a woman crossing the street in Tempe, Arizona, in March.

Look at every robocar crash report filed in California, though, and you get a more mundane picture—but one that reveals a striking pattern. In September of this year, for example, three self-driving cars were sideswiped. Another three were rear-ended—one of them by a bicycle. And that’s not even the strangest one: In June, an AV operated by General Motors’ self-driving arm, Cruise, got bumped in the back—by a human driving another Cruise.

The people developing self-driving cars pitch them as a tool for drastically reducing the nearly 40,000 fatalities on US roads every year. Getting there will take years at least, probably decades, and that means a lot more time spent testing on public roads. And so these sorts of crashes raise a question: What’s the best way to handle what could become a nationwide experiment in robotics and AI, one where the public participants haven’t willingly signed on and the worst-case scenario is death?

We don’t have the answers. But chipping away at these questions starts with understanding the problem. And that means looking at the data.

Unfortunately, the publicly available data is quite limited. These are companies in a competitive field, and they don’t voluntarily share much in the way of details. They invite the press or public officials into their vehicles only in tightly controlled situations where they perform well. And anecdotal evidence of weaknesses—like The Information’s report that Waymo cars have trouble with left turns into traffic and frustrate human drivers—is, well, anecdotal.

Of the states where most AV developers do their on-road testing—Arizona, California, Michigan, Nevada, and Pennsylvania—only the Golden State requires companies to report details about their programs. Once a year, they must submit a report to the DMV explaining how many miles they’ve driven and how often the human sitting in the car took the wheel. Anytime one of their cars is in any sort of collision, no matter how minor, the developer must submit a collision report within 10 business days explaining what happened.

Since these regulations took effect in 2014, the California DMV has received (and published) 104 autonomous-vehicle collision reports, including 49 so far in 2018, as more and more cars hit the streets. Most crashes are minor; few are newsworthy. But taken together, they present a picture of how these tests are progressing and how well robots are sharing the road. And they hint at a conclusion similar to what anecdotal evidence suggests—that these vehicles drive in ways humans might not expect, and might not want them to.

As this chart shows, GM’s Cruise has filed by far the most reports in 2018, but don’t read too much into that. If the pattern from 2016 to 2017 holds (we won’t have full 2018 numbers until early next year), Waymo has been dialing down its testing in California in favor of Arizona, while Cruise has been ramping up its testing in the chaos of San Francisco. Waymo has the second-most collisions, followed by Zoox, a startup that also tests in the city.