The company’s name, Aeva, is a play on “EVE,” the name of the robot in the Pixar movie “WALL-E.”

The market for autonomous vehicles will grow to $42 billion by 2025, according to research by the Boston Consulting Group. But for that to happen, the vehicles will need new and more powerful sensors. Today’s autonomous cars are ill-prepared for high-speed driving, bad weather and other common situations.

The recent improvements in self-driving cars coincided with the arrival of new lidar sensors from a Silicon Valley company called Velodyne. These sensors gave cars a way of measuring distances to nearby vehicles, pedestrians and other objects. They also provided Google and other companies with a way of mapping urban roadways in three dimensions, so that cars know exactly where they are at any given moment — something GPS cannot always provide.
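The distance measurement at the heart of lidar rests on a simple idea: a laser pulse travels to an object and back at the speed of light, so half the round-trip time gives the range. The sketch below is illustrative only — the function name and the example timing are invented for this explanation, not drawn from any real sensor’s interface.

```python
# Toy illustration of lidar time-of-flight ranging. The names and the
# example round-trip time are invented for this sketch.
C = 299_792_458.0  # speed of light in a vacuum, meters per second

def range_from_round_trip(seconds: float) -> float:
    """Distance to a target, given a laser pulse's round-trip time."""
    # The pulse covers the distance twice (out and back), hence the / 2.
    return C * seconds / 2.0

# A pulse that returns after 400 nanoseconds implies a target roughly
# 60 meters away.
print(round(range_from_round_trip(400e-9), 1))
```

The same arithmetic also hints at why range is limited: the farther the target, the weaker the returning pulse, and the harder it is to time accurately.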

But these lidar sensors have their own shortcomings. They can gather information only about objects that are relatively close to them, which limits how fast the cars can travel. Their measurements aren’t always detailed enough to distinguish one object from another. And when multiple driverless cars are close together, their signals can become garbled.

Other devices can pick up some of the slack. Cameras are a better way of identifying pedestrians and street signs, for example, and radar works over longer distances. That’s why today’s self-driving cars track their surroundings through so many different sensors. But despite this wide array of hardware — which can cost hundreds of thousands of dollars per vehicle — even the best autonomous vehicles still have trouble in situations that humans navigate with ease.
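One way to picture how a car reconciles readings from several sensors is inverse-variance weighting: each measurement is weighted by how noisy it tends to be, so the more reliable sensor counts for more. This is a toy sketch, not how any production self-driving stack actually fuses sensors, and every number in it is made up for illustration.

```python
# Toy sensor fusion by inverse-variance weighting. All values invented.
def fuse(estimates):
    """Combine (range_in_meters, variance) pairs into one estimate.

    Measurements with lower variance (less noise) receive more weight.
    """
    weights = [1.0 / variance for _, variance in estimates]
    total = sum(weights)
    return sum(w * r for w, (r, _) in zip(weights, estimates)) / total

# Hypothetical readings: lidar reports 50.2 m with low noise, while
# radar reports 51.0 m with higher noise. The fused estimate lands
# closer to the lidar reading.
fused = fuse([(50.2, 0.04), (51.0, 0.25)])
print(round(fused, 2))
```

Real systems use far more sophisticated machinery (Kalman filters and learned models, for instance), but the underlying intuition — trust each sensor in proportion to its reliability — is the same.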