Over the weekend, the story broke that a Missouri lawyer, Joshua Neally, claimed that the Autopilot system in his Tesla Model X saved his life when he suffered a pulmonary embolism while behind the wheel. Neally contends his Model X drove him to the hospital, saving his life.

Some, including KY3, the Missouri news station that originally aired the report, are calling it the "counter-example to that fatal Florida crash." That's because Autopilot saved a life rather than taking one.

This, as it turns out, is exactly the opposite of the conclusion that should be drawn from this story.

Inspecting the account of what transpired without wearing rose-colored glasses, the story looks a lot less remarkable. In fact, it quickly becomes clear that it's an account of yet another Tesla owner misusing and over-utilizing Autopilot.

Unable to drive

Before we dig into why, let's quickly recap what Neally says occurred.

According to KY3, while driving down Missouri Highway 65 last week, Neally was suddenly overcome by the "most excruciating pain" he'd ever felt. Later, he would learn he was suffering a pulmonary embolism, a blockage in an artery in the lungs.

"I couldn't breathe, I was gasping, kind of hyperventilating," Neally told KY3. The pain was so severe that, by his own admission, Neally was unable to drive.

Instead of pulling over to wait for an ambulance to take him to the hospital, Neally rerouted his Model X to a nearby hospital. With Autopilot engaged, his Tesla reportedly drove to within a few blocks of the hospital. Then, Neally manually drove the all-electric luxury SUV the rest of the way to the emergency room.

Neally was "totally distracted from driving."

Admittedly, this sounds miraculous at first blush. The more you think about it, though, the less amazing it becomes. Let's start with Neally's decision not to pull to the side of the highway and wait for an ambulance.

In his area of Missouri, over 85 percent of ambulances respond within nine minutes. Likely, Neally wasn't thinking rationally. As he told KY3, he "knew" he had to get to the ER.

According to Dr. Howard Liebman, medical professor and hematologist at the Keck School of Medicine at the University of Southern California, Neally likely had time to safely pull to the side of the road and wait for an ambulance to take him to the ER.

"Ten, 15 or even 20 minutes later ... I am not sure would have made a difference," Liebman said. Waiting hours — not minutes — would have risked Neally's life, Liebman points out.

Furthermore, Liebman asserts that if Neally was conscious enough to engage Autopilot and enter the hospital in the navigation and then later drive onto hospital property, Neally would have been conscious enough to pull to the side of the road and call for an ambulance.

Despite that, Neally chose to rely on Autopilot to do the driving. And it's that decision that once again underscores the crux of the Autopilot problem. Concerns surrounding Autopilot's misleadingly robust functions were first raised after the fatal Autopilot crash in Florida in May, since the driver may have been watching a Harry Potter movie rather than paying attention to the road.

Like in the fatal Florida crash, Autopilot was being relied upon too heavily.

Let's not mince words: Neally was driving impaired. His decision to continue on to the ER in a debilitated state, relying on his Tesla to do the driving, matters. It matters because Autopilot isn't a driverless system, despite being marketed — and viewed by the public — like one.

Although it feels pretty darn self-driving, it's not. And Tesla tries to make that clear before the system can even be activated. Before engaging Autopilot, drivers must click a box acknowledging that they will "maintain control and responsibility" for the vehicle. And KY3 reported Neally was "totally distracted from driving."

This means Autopilot was being asked by its driver to do more than it is capable of doing safely. If Autopilot had failed Neally, this story could have ended very differently.

No 'win' for Tesla

"This is no 'win' for Tesla," Liebman chuckled. "I wouldn't overdo it." He added, "I'm all for technology ... but I am not sure this is a sign of a unique victory for Tesla."

Liebman is right. We must remember that semi-autonomous technology isn't exclusive to Tesla. Mercedes-Benz's 2017 E-Class, for example, features Drive Pilot, the company's own semi-autonomous driving system comparable to Tesla's Autopilot.

In fact, Drive Pilot is even more robust than Autopilot, technically speaking. However, due to safety concerns, Mercedes (unlike Tesla) won't allow it to assume as much of the driving duties as it is truly capable of.

What's more, Honda offers a similar system to Autopilot called Honda Sensing on its vehicles, including the $26,125 2016 Civic Coupe. The list of other automakers offering semi-autonomous safety systems doesn't end with Honda and Mercedes. Volvo and Audi offer similar systems, too.

Caption: A Tesla Model X is test driven at the company's headquarters Tuesday, Sept. 29, 2015, in Fremont, Calif. Image: Marcio Jose Sanchez/AP

So to hold up Neally's Model X as a distinctive lifesaver is misleading. Setting aside the fact that it shouldn't have been used the way it was, Autopilot isn't alone in its ability to keep drivers safe on the highway. That said, it's the only one that could have performed this task — and that's not a good thing.

Misusing and over-utilizing

This seemingly remarkable headline once again highlights the problem not with autonomous driving technology, but rather with the public's view and acceptance of it — especially among Tesla owners.

No semi-autonomous system can currently be relied upon to safely perform all driving duties.

No matter how robust or in control an autonomous driving system might feel, Autopilot — and other systems like it — cannot be wholly trusted or relied upon to fully take over driving duties. This is especially true when the driver is impaired or unable to drive, like Neally was.

Going by Tesla's own Autopilot user agreement, Neally misused his Autopilot system. Despite that, he thankfully survived not just his pulmonary embolism but also his ride to the ER. However, Neally's experience isn't the rule, it's the exception, which happened to not end in tragedy ... this time.

Unfortunately, stories like Neally's perpetuate the dangerous misconception that Autopilot and other semi-autonomous systems can be fully relied upon to do all the driving.

Yes, driverless cars like Google's prototypes, designed to let you ignore the road and safely watch Harry Potter (or suffer a pulmonary embolism) while your vehicle handles driving duties, are coming. They're not here yet, though.

Remember, it's not just your own safety you need to consider, but the safety of everyone else on the road, too. So, yes, your car might be able to drive you to the hospital (or real close) while you're suffering a medical emergency. That doesn't mean you should let it, though.