Pedestrians pose one of the biggest challenges for self-driving cars. Other cars mostly behave in predictable ways, but pedestrians can be more erratic. They can move in any direction, at any time, with little warning.

Fortunately, we humans are experts at understanding the behavior of other people. Human drivers can easily distinguish someone who is trying to cross the road in front of us from someone who is waiting for the bus or focused on his cell phone.

And that allows human beings to be smooth, confident drivers. If we pass a sidewalk full of pedestrians, we can tell at a glance whether any of them are preparing to step into the road. If not, we know it's safe to maintain our speed.

But encoding that kind of intuition into software isn't easy, which can lead to self-driving cars that are timid or erratic. A car that doesn't understand pedestrians will need to slow down every time it passes near one—since it can't rule out the possibility that the pedestrian might wander in front of the vehicle.

A new startup called Perceptive Automata is aiming to change that by using machine-learning techniques to give self-driving cars a rich understanding of pedestrians, cyclists, and other human beings around the car.

Human intuition in software

"We're building a module that allows autonomous vehicles to understand the state of mind of humans out on the road," says co-founder Sam Anthony. He told us that the software will "give autonomous vehicles the ability to look at a person and say, in a human-like way, 'This person wants to cross the road, this person knows that my car is there.'"

Ordinarily, machine-learning techniques train algorithms using data that can be measured objectively. But that's not really practical in a situation like this—it's not like the company can read the minds of pedestrians on the side of the road for its training data. Instead, Perceptive Automata relies on the subjective judgment of other human beings to provide the data used to train its algorithms.

The company asks human beings to watch video clips and then label the pedestrians in them, giving their best judgment about whether each pedestrian was trying to cross the street and whether he noticed the car. Perceptive Automata's engineers then use this dataset of labeled videos to train machine-learning algorithms to make the same kind of judgments.
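The core move here is treating disagreement among human labelers as signal rather than noise: instead of forcing each clip into a hard yes/no, the judgments can be aggregated into a soft target for training. The sketch below is purely illustrative — the clip names, question format, and aggregation method are assumptions, not Perceptive Automata's actual pipeline.

```python
# Hypothetical sketch of turning subjective human judgments into soft
# training targets. Not Perceptive Automata's real data or code.

from collections import defaultdict
from statistics import mean

# Several annotators watch the same clip and answer:
# "Is this pedestrian trying to cross?" (1 = yes, 0 = no)
annotations = [
    ("clip_001", 1), ("clip_001", 1), ("clip_001", 0),
    ("clip_002", 0), ("clip_002", 0), ("clip_002", 0),
]

# Group judgments by clip, then average them into a value in [0, 1].
# A clip where annotators disagree gets an intermediate score —
# the ambiguity itself is useful information for the model.
by_clip = defaultdict(list)
for clip, judgment in annotations:
    by_clip[clip].append(judgment)

soft_labels = {clip: mean(judgments) for clip, judgments in by_clip.items()}
print(soft_labels)
```

A machine-learning model trained against these soft labels would then learn to output a graded "trying to cross" score for new pedestrians, rather than a brittle binary decision.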

The result, the company says, will be a software module that any self-driving carmaker can buy and drop into its existing autonomy stack. Perceptive Automata argues self-driving carmakers should think of it as an additional sensor that can effectively read the minds of pedestrians around the vehicle. This "sensor" data can then be combined with data from cameras, lidars, and other hardware sensors to enable smoother, more confident driving.
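To make the "additional sensor" framing concrete, here is a hedged sketch of how a downstream planner might consume intent estimates alongside ordinary detection data. The field names, thresholds, and speed logic are all invented for illustration; nothing here reflects Perceptive Automata's actual interface.

```python
# Illustrative only: a planner consuming a hypothetical "intent sensor"
# output fused with distance data from cameras/lidar.

from dataclasses import dataclass

@dataclass
class PedestrianEstimate:
    distance_m: float        # from hardware sensor fusion
    wants_to_cross: float    # 0..1 score from the intent module
    aware_of_car: float      # 0..1 score from the intent module

def target_speed(ped: PedestrianEstimate, cruise_mps: float) -> float:
    """Slow only when intent suggests the pedestrian may step out."""
    # Highest risk: wants to cross AND hasn't noticed the car.
    risk = ped.wants_to_cross * (1.0 - ped.aware_of_car)
    if ped.distance_m < 30 and risk > 0.5:
        return cruise_mps * 0.5   # cautious: likely unaware crosser
    return cruise_mps             # confident: maintain speed

# A pedestrian waiting at a bus stop who has clearly seen the car:
waiting = PedestrianEstimate(distance_m=10, wants_to_cross=0.1, aware_of_car=0.9)
print(target_speed(waiting, 13.0))
```

The point of the sketch is the contrast with an intent-blind system: without the two intent scores, the only safe policy near any pedestrian is to slow down every time.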

An obvious question here is whether companies developing self-driving cars will really outsource this function to a third party like Perceptive Automata rather than developing the capability in-house. But Anthony told us that companies building self-driving cars have a lot on their plate and predicted that they'd welcome the ability to outsource this function.

"For someone who's doing the full stack, they're fighting so many fires just to get their testing fleet out there," Anthony said. There's an art to efficiently collecting data from human beings about other human beings, he said, and it's pretty far afield from the other tasks self-driving car makers are tackling.

Is it really necessary to model pedestrian thinking?

But self-driving car companies like Waymo and Uber are collecting millions of miles of sensor data from test driving. I asked Anthony whether they could use this sensor data to directly predict pedestrian behavior—simply by seeing which pedestrian behaviors tend to precede actually crossing the street.

But Anthony argues that it's much more difficult to predict whether a pedestrian will cross the street if you don't have a model of whether the pedestrian wants to cross the street. After all, a pedestrian might stand at the side of the road for several seconds waiting for the right moment to cross. What a self-driving car ultimately cares about is whether the pedestrian is trying to cross the street, not just whether he or she is going to do so in the very next second.

I didn't find this entirely persuasive. It seems like with enough data—and Waymo now has 8 million miles of data—statistically predicting which pedestrian attributes and behaviors are correlated with street-crossing behavior should be possible. Data about the pedestrians' state of mind would certainly be a "nice to have" feature, but I'm not convinced that it's a must-have feature.

Other researchers are working on methods to predict pedestrian movements based entirely on observed behavior—without asking humans to annotate training data with pedestrians' state of mind. It's possible this kind of technique will work well enough to allow pedestrian behavior to be inferred solely from the sensor data companies already have.

Still, Anthony says that Perceptive Automata has attracted significant interest. Nvidia has described Perceptive Automata as a partner. Current customers include carmakers and their tier 1 suppliers, as well as some startups, Anthony told Ars. The company hasn't yet publicly identified any of its customers, however.