This week I attended AUVSI's "Driverless Car Summit" in Detroit. This year's event, the third, featured a bigger crowd and a decent program, and will generate more than one post.

I would hardly call it a theme, but two speakers expressed fairly negative comments about Google's efforts, raising some interesting subjects. (As an important disclaimer, the Google car team is a consulting client of mine, but I am not their spokesman and the views here do not represent Google's views.)

The keynote address came from Bryan Reimer of MIT, and generated the most press coverage and debate, though the recent NHTSA guidelines also created a stir.

Reimer's main concern: Google is testing on public streets instead of a test track, and in doing so it takes the risk of a fatal accident, from which the blowback could be so large that it stifles the field for many years. Car companies historically have done extensive test track work before going out on real streets. I viewed Reimer's call as one for near perfection before there is public deployment.

There is a U-shaped curve of risk here. Indeed, a vendor who takes too many risks may cause an accident that generates enough backlash to slow down the field, and thus delay not just their own efforts, but an important life-saving technology. On the other hand, a quest for perfection attempts what seems today to be impossible, and as such also delays deployment for many years, while carnage continues on the roads.

As such there is a "Goldilocks" point in the middle, with the right amount of risk to maximize the widescale deployment of robocars that drive more safely than people. And there can be legitimate argument about where that is.

Reimer also expressed concern that as automation increases, human skill decreases, and so you actually start needing more explicit training, not less. He is therefore concerned with efforts to build what NHTSA calls "level 2" systems (hands off the wheel, but eyes on the road) as well as "level 3" systems (eyes off the road, but the driver may be called upon to take over in certain situations). He fears it could be dangerous to hand driving back to people who now rarely do it, and that stories from aviation bear this out. This is a valid point, and in a later post I will discuss the risks of the level-2 "super cruise" systems.

Maarten Sierhuis, who is running Nissan's new research lab (where I will be giving a talk on the future of robocars this Thursday, by the way) issued immediate disagreement on the question of test tracks. His background at NASA has taught him that you "fly where you train and train where you fly" -- there is no substitute for real world testing if you want to build a safe product. One must suspect Google agrees -- it's not as if they couldn't afford a test track. The various automakers are also all doing public road testing, though not as much as Google. Jan Becker of Bosch reported their vehicle had only done "thousands" of public miles. (Google reported a 500,000 mile count earlier this year.)

Heinz Mattern, research and development manager for Valeo (a leading maker of self-parking systems) went even further, opening his talk by declaring that "Google is the enemy." When asked about this, he did not want to elaborate, but asked, "Why aren't they here (at the conference)?" There was one Google team employee at the conference, though not speaking, and I'm not an employee or rep. It was pointed out that Chris Urmson, chief engineer of the Google team, had spoken at the prior conferences.

In private, others expressed to me frustration at how little information comes out of Google, which has remained mum about any business plans, saying only that it has talked to all major car vendors, and believes cars will be on the road by 2017. Car companies, by contrast, see themselves as much less secretive. It's normal for car vendors to generate streams of concept cars showing off features that might be found in future cars, and to talk openly to generate buzz. They keep certain aspects of new car releases secret, but due to the long development cycles in cars, don't seem as afraid to reveal details 1-2 years before a car comes out. Because Google keeps things close to the vest, they said, it generates uncertainty and distrust, because they can't tell whether the effect of Google's efforts will be positive or negative for them.

Another thing I learned from car company insiders was something long suspected: that projects for self-driving systems inside car companies were greenlit or had their budgets increased within a week of Google's car being announced to the world. Whatever Google ends up doing, it clearly lit a fire under the car companies and got the field in motion.

In a related issue, there was discussion of NHTSA's recommendations to states and description of how they are researching robocars. Most people, including myself, read this report as saying that states should (like Nevada and others) allow the testing of prototypes on public roads, but should hold off on permitting operation by ordinary people.

I sat down with Nat Beuse, who helped author the report at NHTSA, and he was surprised that people had that impression, but I still fail to see how else to read it. At the present time, analysis of state vehicle codes suggests that both testing and operating robocars is legal, because that which is not forbidden is by default permitted, and these systems are akin to very smart cruise controls. (Running unmanned vehicles is another story, though.)

With this in mind, state efforts which declared that only testing was allowed would in effect ban use by customers. And once a ban is in place, it is very hard to get it reversed, and doing so can take a long time and a lot of study.

During the discussion session, I put forward a different thesis. Today, there are millions of teens with learner's permits. With no skill, they are allowed out on the road, often with just a parent's supervision, or sometimes under the supervision of a driving instructor, who usually has both their own backup brake pedal and the ability to grab the wheel. Google, Continental, Audi and all the other companies testing on the road also work this way. The software drives, but a safety driver sits in the driver's seat, carefully watching and ready to use the brakes or wheel if there is a problem. I think it's not unreasonable to claim that the latest robocar prototypes are as safe as a teen taking their first try at the wheel, and this "driving instructor" approach might be a better way to look at vehicle testing. NHTSA and the states can then take on their traditional role, which is to wait, and only regulate if safety problems arise which will not be fixed without regulation.

Another issue brought up (by myself and others) in the panel of state regulators was how these regulations will affect "garage tinkerers." Dennis Schornack, senior advisor to the governor of Michigan, was keen to point out that the history of Michigan's car industry is full of innovation that came from solo inventors in their garages, and that a large portion of Michigan's economy grew from that. If the regulations are such that only large companies can comply, this vital channel could be cut off.

Today it's hard to see a small group developing a commercial robocar without the resources of a GM or Google. But all the DARPA challenge teams which started this off were small, and in the future, it will be possible for smaller and smaller players to make a difference.

The state regulators said they weren't actually all that keen to regulate, and that this impetus had come down from their legislatures. Let's hope they regulate well.

More coverage continues in part two.