From Quartz

To watch a self-driving car park itself seems like magic. Pull back the curtain, though, and things get a lot messier. Cars mistake snowflakes for obstacles, lose track of lane markings, and miss vehicles parked on the side of the road.

Engineers are racing to make cars drive better than humans, with the aim of saving millions of lives each year. Human error is to blame for 94% (pdf) of annual US traffic fatalities, according to the US National Center for Statistics and Analysis, and autonomous vehicles promise to prevent most of them. Even today’s off-the-shelf features, such as widely available lane-departure alerts, could cut fatal crash rates by 86%, estimates the Insurance Institute for Highway Safety (IIHS).

That’s impressive, and machine learning continues to revolutionize what’s possible.

Uber and Alphabet’s Waymo are ferrying passengers in self-driving vehicles (with safety drivers) in cities from Pittsburgh, Pennsylvania, to Phoenix, Arizona. Yet fully self-driving cars may come first to retirement homes, corporate campuses, and private communities: controlled environments where computers can easily map their world. “I challenge any car company to drive through a complex urban environment without a driver under any weather conditions,” says Ryan Chin, co-founder and CEO of Optimus Ride, which reportedly has a dozen or so campuses and communities ready to pilot its self-driving technology. “We’re not there yet as an industry. Even the best systems aren’t there yet.”

Eight things that make autonomous cars go ‘say what?!’

What fools today’s semi-autonomous cars? Raindrops and snowflakes, masking tape and seagulls: all of them throw algorithms for a loop. Quartz assembled some of the most prominent challenges for self-driving cars below.

Altered stop signs

Computer science researchers subtly altered (pdf) stop signs to see whether small changes, ones a human driver might not even notice, could confuse self-driving cameras. Fake graffiti caused algorithms to misidentify the stop sign as a speed-limit sign two-thirds of the time, while applying random strips of tape, which the researchers called an “abstract art sticker attack,” resulted in misclassification 100% of the time.

Falling snowflakes

Snowflakes and raindrops are notorious for scattering sensors’ signals, creating the illusion that obstacles surround the vehicle. Algorithms are getting better at using lasers to paint a high-resolution 3D map of the environment and distinguish H2O from solid objects, but winter remains one of self-driving cars’ biggest challenges. Snow obscures where computers perceive the road to start, and alters tire traction. “In a lot of [cold and temperate] regions, it’s going to be a lot longer before we see autonomous vehicles than some people would like you to believe,” says Sam Abuelsamid of Navigant Research. “You’re not going to have autonomous vehicles running around Toronto in the wintertime in 2020.”

Seagulls

Birds, too, can confound computers. In Boston, NuTonomy had to reprogram its cars to disperse stubborn seagulls. “For the local breed of unflappable seagulls—which can stop autonomous cars by simply standing on the street, unbothered by NuTonomy’s quiet electric cars—engineers programmed the machines to creep forward slightly to startle the birds,” reports Bloomberg.

Foam

Researchers at the University of South Carolina disoriented a Tesla Model S by covering obstacles in sound-dampening foam so that its ultrasonic sensors did not detect them. Similarly, $40 worth of Arduino computers and an ultrasonic transducer (for generating sound waves) could trick a Tesla into avoiding an open parking spot, or jam its ultrasonic sensors so they missed actual obstacles at close range.

Exiting vehicles

Semi-autonomous cars sometimes orient themselves using the cars around them. At higher highway speeds, where lane lines are easy to track, that’s fine, but in slower traffic it can lead to an unexpected swerve as a car follows the vehicle ahead onto an off-ramp. “When a car is traveling too slow to track lane lines, active lane-keeping systems use the vehicle in front as a guide,” IIHS states. “If the lead vehicle exits, the trailing car might, too.”

Hills

IIHS test drivers in the hills of central Virginia found that even advanced driver-assistance systems could miss lane markings as vehicles crested hills. Without a view of the road ahead, cars swerved left and right in search of the lane’s center, alarming drivers, who were given no warning to take control of the vehicle.

Bridges

Bridges are a black box for autonomous cars, reports Electronic Component News. Because bridges lack many of the environmental cues present on ordinary roads, they can prevent sensors from keeping the vehicle on track. The magazine compared it to “walking a straight line from one end to the other in a massive room, and the lights go out when you’re halfway across. While you don’t see anything, you have a general idea of the direction to continue, but are very susceptible to getting thrown off-course.”

Tree shadows

In IIHS tests, Tesla’s Model 3 made “unnecessary or overly cautious” braking maneuvers 12 times over 180 miles. Seven of those occurred where trees cast shadows on the road; the rest involved oncoming vehicles in another lane or vehicles crossing the road far ahead. “The braking events we observed didn’t create unsafe conditions because the decelerations were mild and short enough that the vehicle didn’t slow too much,” IIHS says. “However, unnecessary braking could pose crash risks in heavy traffic, especially if it’s more forceful. … Plus, drivers who feel that their car brakes erratically may choose not to use adaptive cruise control and would miss out on any safety benefit from the system.”