Owners of newer Tesla models are driving cars with imperfect code, risking accidents.

On a recent Saturday evening in South Carolina, Ward Mundy had his hands firmly on the steering wheel of his Tesla Model S when his wife let out a shout. The car was zig-zagging wildly across the road.

“It’s about the most dangerous ride you can imagine when you turn on Autopilot,” says Mundy, a lawyer and car enthusiast. He managed to get the Model S back under control, but it wasn’t the only time he’s had to rely on quick reflexes to avoid an accident. “You can be sailing along at 50 mph and the radar spots [an approaching] bridge and immediately slams on the brakes.”

Sometimes, the car doesn’t react when it should. “The other extreme is that you approach a stoplight with a car already stopped, and the Tesla doesn’t apply the brakes at all,” says Mundy.

Autopilot is the catch-all term for Tesla’s self-driving technologies. Its suite of hardware and software aims to match a car’s speed to traffic conditions, keep a car within its lane or automatically change lanes (a feature called Autosteer), park a car in a nearby spot, or allow it to be summoned from a garage. Elon Musk has said that by the end of this year, an Autopilot-enabled Tesla will drive from Los Angeles to New York without a human needing to touch the wheel at all.

To Mundy, that day seems a long way off. “It’s really a pretty scary experience,” he says. “If you’d ridden in the car with my wife, you would know how many times she’s screamed to turn it off.”

The Mundys’ recent Autopilot experiences are echoed by other Tesla owners in online forums and in YouTube videos of veering cars and near misses. Tesla has been building new Autopilot hardware and software into every car that’s rolled off the production line since November, rapidly rolling out self-driving capabilities before they’re fully tested. Even though many Tesla drivers seem willing to play along, the company’s strategy has some of them worried.

The stakes are high for Tesla as it gambles on this aggressive approach to testing. Owners who want to activate their cars’ Autopilot feature have to pay thousands of dollars extra. If drivers opt not to, the company loses out in its efforts to recoup its costs. But with a growing record of unexpected swerves, fishtails, and other miscalculations, Tesla is risking not only a hit to its largely sterling reputation, but also the lives and safety of some of its biggest fans.

Things looked a lot rosier last March, when Elon Musk got up on stage at Tesla’s Design Center in Hawthorne, California, and promised a car with full self-driving technology for just $35,000 (although activating it would cost extra). Tesla’s Model 3 would be the company’s great leap forward: a stylish, speedy electric vehicle smart enough to let you read a book during your commute, or ride-share your robotic chauffeur to earn a few dollars.

The company’s semi-autonomous Autopilot, which relies on radars, cameras, and ultrasonic sensors, had been working well. And Musk’s decision not to use lidar, the powerful but expensive laser-ranging system found in most self-driving rivals, seemed to be paying off.

First introduced in October 2015, Autopilot was an instant hit with owners, who loved its ability to drive on highways with little human control. They were so enthusiastic, in fact, that Musk quickly promised to rein in Autopilot’s hands-free operation to “minimize people doing crazy things with it.”

Sadly, the restrictions did not work for Joshua Brown. In May 2016, the 40-year-old Model S owner activated Autopilot while driving on a highway in rural Florida. The car failed to spot a truck making a left turn in front of him and drove right under it, killing Brown outright. A reconstruction by the National Highway Traffic Safety Administration (NHTSA) indicates that the truck should have been visible to Brown for at least seven seconds before impact, suggesting he was paying no attention to the road ahead.