It seems like every other week an autonomous driving tech company (or three) is making headlines with fantastic claims of self-driving commercial truck routes hauling everything from butter to beer to refrigerators. Reading today’s tech news you’d think a living, breathing truck driver is soon to be a thing of the past.

But how much of the hype is actually true? And is it legal (and safe) to use highly autonomous technology on public highways?

The Society of Automotive Engineers designates six levels of autonomy for motor vehicles. Levels zero, one and two are considered “support features” that require the operator to drive and constantly supervise the features.

Lower levels of autonomy have been in use for years – cruise control, automatic braking, collision warning – all forms of low-level autonomous technology that have become commonplace in both passenger vehicles and commercial tractors.

SAE autonomous levels three through five assert the operator is not driving while these features are in use, even if they are sitting in the driver’s seat. Level four autonomy will not require the driver to take over within its defined operating conditions, while level five can pilot the vehicle everywhere, in all conditions.

Self-driving start-up Plus.ai splashed all over tech and trucking news late last year when it reported transporting a load of Land O’ Lakes butter from Tulare, Calif., to Quakertown, Pa., claiming “the first L4 U.S. cross-country commercial pilot hauling a fully-loaded refrigerated trailer of perishable cargo.”

Who is Plus.ai?

According to their LinkedIn page, “Plus.ai is a world leader in self-driving truck technology. Headquartered in Silicon Valley with R&D offices in China, it was founded in 2016 by a group of serial entrepreneurs and industry veterans with over 20 years of experience in high tech and artificial intelligence.”

The FMCSA Carrier Registration online directory lists the company as having 12 power units and six drivers. As of March 10, its out-of-service rate runs 9.1%.

For perspective, the national average is 5.5%. The above-average rate can be attributed to the small number of units and one particular out-of-service violation issued during a Utah Level I inspection in June 2019.

Three citations were issued on that date, including operating a CMV without a CDL, which falls under driver fitness in the SMS breakdown, and driving beyond the eight-hour limit since the end of the last off-duty or sleeper berth period of at least 30 minutes, a violation that also weighs heavily into the equation.

Of note, but not used in the SMS equation, three of the 11 recorded inspections of Plus.ai commercial tractors have resulted in violations for improper size, location or color of CMV markings, two of which occurred on the same truck within 90 days of one another.

This was also the truck later used in the cross-country autonomy experiment.

At some point late in the fall of 2019, Plus.ai boldly wrapped the truck and trailer with its logos, displayed its DOT number correctly, installed cameras, radar and light detection and ranging (lidar) sensors on a 2019 International LT625, and set out to make what it considered to be a better butter delivery.

The 2,800-mile route via I-15/I-70 was said to be completed in “less than three days,” which requires covering roughly 933 miles a day. This raises the question of whether the delivery was better or just flat-out illegal in more ways than one.

Are autonomous vehicles exempt from hours-of-service or even mentioned in the FMCSA regulations at all?

The short-term answer, published on the FMCSA website, states:

“Based on FMCSA’s preliminary assessment of its safety requirements and the potential of ADS-equipped vehicles, the Agency believes individuals responsible for taking control of an ADS-equipped vehicle on a public road should be subject to the current driver-related rules.”

The long-term answer is that this is a very complex situation, one that will require reconsidering and altering nearly the entire current set of FMCSA regulations. Currently, companies are able to exploit distinct gray areas because of the lack of enforcement, all in the name of advancing technology and commerce, not to mention raising billions of dollars in research and development investment.

When pressed to explain the hours-of-service equation used for the heavily hyped butter trip, Plus.ai COO and co-founder Shawn Kerrigan issued this statement through the company’s public relations contact, Lauren Kwan.

“Safety is top priority for Plus.ai. Our cross-country commercial freight run was accomplished with a driver and operations specialist in our truck at all times, and to ensure compliance with hours of service requirements several drivers and operations specialists switched off during the trip. The key point of the cross-country drive was to demonstrate the maturity, safety and reliability of our autonomous driving system, which ran continuously for the three day, 2,800-mile journey.”

Just to be clear here – previous media outlets indicated “a” driver and “a” safety engineer and made no mention of multiple teams completing the trip in observance of federal hours of service. It bears noting that job postings on Plus.ai’s website do not require safety engineers to hold a commercial driver’s license but do require a clean driving record and driver’s license.

In addition, Kwan did confirm that the vehicle contained an ELD. She declined to comment on the device’s make or model.

Efforts to determine whether any exemptions were offered to Plus.ai regarding hours of service and the use of Level 4 autonomous technology for this particular trip were referred to Duane DeBruyne, the agency’s deputy director of communications/media relations, who responded via email:

“The agency has been in contact with the company to obtain more information, but based on the descriptions in the media reports, we believe relief drivers were used and manually operated the vehicle during portions of the run, and thus were compliant with hours-of-service requirements.”

Kerrigan’s assertions that the autonomous driving system operated “continuously” conflict with DeBruyne’s statement that relief drivers manually operated the vehicle during portions of the run.

Is it a case of asking for forgiveness instead of permission to use the public motoring community to “train” artificial intelligence until there are specific federal safety regulations in place? Possibly.

Public knowledge and information about testing these vehicles has come mostly after the fact. The question of how safe it is to operate high-level autonomy on the same roads used by private vehicles and inexperienced drivers has been left up to the same legislators who have made grave errors in judgment regarding commercial vehicle rules and regulations.

If we defer to the individual states and their specific laws regarding high-level autonomous commercial vehicles, we run into the same issues. According to the National Conference of State Legislatures, which tracks individual state laws on the use of public highways for high-level autonomous testing, 29 states have enacted legislation.

To further muddy the legal waters, each state has a varying degree of tolerance for fully autonomous testing on public roads.

Of the 10 states the butter delivery ran through, only two are devoid of any autonomous policy whatsoever.

According to Missouri Department of Transportation Motor Carrier Services Program Manager Matthew Keifer, “The state of Missouri has no laws on the books that would prohibit the use of autonomous vehicles in our state. It seems we do not explicitly state it is allowed but at the same time there is nothing prohibiting it. At this time we have no regulations on autonomous vehicles.”

No mention of an autonomous policy in Kansas could be verified, and multiple requests for verification from the Kansas DOT went unanswered.

Illinois requires advance notice and pre-qualification, while Ohio has set aside designated highway corridors. The differences go on for each and every state policy.

Regardless of what may be on the books, it’s clear that a truck rolling through any of these state weigh stations without a driver would be immediately pulled around back. A random selection of 10 scale houses throughout the reported route were contacted by Land Line and asked what procedure would be observed if an unmanned truck rolled through the scales. Unequivocally, the answer was to pull it and shut it down for further investigation.

At least one industry expert says the capabilities of autonomous technology for commercial vehicles are being overstated.

Professor Missy Cummings of Duke University’s Pratt School of Engineering makes no bones about her thoughts on the butter run hype.

“This news story is very similar to what others have claimed (I think Anthony Levandowski did something similar years ago). I don’t see this as groundbreaking or really anything to raise an eyebrow about,” Cummings said in an email to Land Line.

Autonomous technology is clearly making strides, but it still needs an abundance of human intervention and likely will for some time. It’s safe to say we will not have comprehensive legislation on either federal or state levels within the next decade, nor will our infrastructure progress at a rate necessary to turn fleets of completely unmanned, fully loaded commercial vehicles out on public highways any time soon, no matter what the hype says.