Humans have developed a complicated process for developing trust in other humans and in systems operating under the direct control of humans. Obviously, these processes are far from perfect, and sometimes we pay a heavy price for trusting someone whom we should not have trusted. In all developed societies, there exists a legal system that acts as a deterrent against betraying the trust of a fellow human being.



Autonomous vehicle technology (e.g., cars, trucks, trains, airplanes, boats) is maturing at a rapid pace. These systems will most likely operate without direct human supervision. Most complex autonomous systems will be controlled by millions of lines of software. Even the programmers who wrote the code cannot always tell how the system will behave in certain unusual situations.



In light of this challenge, many people are beginning to ask: what do we need to do to ensure that we can trust autonomous vehicles? Currently, this question is being asked mainly by engineers, and efforts are under way to come up with solutions. These solutions are likely to be expressed in complex technical terms, which will not help in convincing the general public to trust these vehicles.



We engineers are not very good at explaining technology-related issues to the general public. Some of you might think that this is an understatement of epic proportions! I agree. We are woeful at communicating with the general public! I will be bold and attempt to articulate trust issues in easy-to-understand terms. This post explores how humans develop trust in complex new situations and attempts to break trust down into its constituent ingredients so that, hopefully, the general public can begin to participate in this discussion.



Let us begin with a simple thought experiment to better understand how we make decisions about trust. Imagine that you have landed in a new country. You do not speak the local language. It is dark outside, and the weather does not look good. This country is notorious for its poor roads. Your hotel is far away from the airport. Your flight was late, and so you have missed the last bus from the airport to the city.



A person approaches you and offers you a taxi ride. You are communicating by gestures. You are really worried about whether this person is able to understand you. You certainly do not want to go to the wrong hotel in the middle of the night. Fortunately, you find a local teenager hanging around the airport who knows English, so you use that teenager as an interpreter. Should you accept the taxi ride from this person?



Here is the first question that might cross your mind: will this guy and his car safely take you to your hotel in a reasonable amount of time? This question in turn breaks down into the following three questions:

Is the driver competent? (I have been driven by taxi drivers who can induce a heart attack with their driving style!)

Is the vehicle reliable? (I have seen taxis with copious amounts of duct tape holding cracked windshields together. I have seen MythBusters episodes that demonstrate the amazing qualities of duct tape, but I am not comfortable seeing duct tape on windshields.)

Is the vehicle safe? (There are taxi drivers out there with the motto "seatbelts are for wimps.")

If the country where you have just landed has an adequate driver licensing system and vehicle inspection program, then you probably should not worry too much about the questions above. However, you may want to thoroughly inspect the taxi yourself before getting into it.



The next question on your mind would probably be whether or not you will be overcharged for the ride. What can go wrong? The driver might take a long route and charge you an unreasonable fare. He might stop at a rest stop where he gets a free meal while you are forced to buy an expensive, lousy sandwich just to get the privilege of using the restroom. Clearly, this would not be fair.



There are regions of the world where kidnapping or robbery is a genuine concern. How would you know that this driver is not an imposter? Who knows, you might get into serious trouble for riding in this taxi and end up in a dark hospital room with a kidney missing. Hopefully, verifying the authenticity of the driver is the next thing on your mind.



Unfortunately, you were unable to exchange dollars for the local currency. Remember that in this scenario your flight was late, and so the foreign exchange counter was closed. The driver says that he will accept your credit card if you let him make a copy of your passport to verify your identity. How do you know that your passport information will not be stored in some unsafe fashion? You should be concerned about the protection of your private information in this transaction.



Hopefully, the country has a good legal system that acts as a deterrent against the driver blatantly ripping you off. Hopefully, you are aware of news related to this country, and if the kidnapping and/or robbery rate were high enough to pose a real risk, you would have heard about it. You probably also talked to a friend who visited this country last year and asked for his impressions. You would have used a combination of (1) first-hand examination, (2) existence of a deterrent, (3) reputation, and (4) referral to make your decision.



If you decided to sit in that taxi on that stormy night, you decided to trust that driver and his taxi. Here is how you arrived at that decision. You were convinced that you were able to communicate successfully with the driver (e.g., he understood your destination and payment method). Implicitly, you estimated that the probability is very high that the driver and taxi will exhibit an acceptable level of (1) competency, (2) reliability, (3) safety, and (4) fairness. In addition, you assessed that the probability that the driver is an imposter is very low, and that the probability of your private information falling into the hands of unsavory characters is also very low.
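This implicit estimation can be made explicit. As a rough sketch (all the probability values, the per-ingredient floor, and the overall threshold below are invented purely for illustration, and the independence assumption is a simplification), one might model the decision as requiring every trust ingredient to clear a minimum bar and the joint probability to be high enough:

```python
# Hypothetical probability estimates for each trust ingredient,
# judged from first-hand examination, reputation, and referral.
estimates = {
    "competency": 0.95,    # the driver seems to handle the car well
    "reliability": 0.90,   # no duct tape on the windshield
    "safety": 0.90,        # seatbelts are present and functional
    "fairness": 0.85,      # the fare was agreed upon up front
    "authenticity": 0.97,  # the driver is not an imposter
    "privacy": 0.92,       # the passport copy will be handled safely
}

def decide_to_trust(estimates, floor=0.8, overall=0.5):
    """Trust only if every ingredient clears a minimum bar AND the
    joint probability of all ingredients holding at once (treated as
    independent, a simplification) exceeds an overall threshold."""
    if any(p < floor for p in estimates.values()):
        return False  # one glaring weakness vetoes the whole decision
    joint = 1.0
    for p in estimates.values():
        joint *= p
    return joint >= overall

print(decide_to_trust(estimates))  # → True with the numbers above
```

Note the veto rule: no amount of fairness compensates for a driver you believe is an imposter, which matches how most of us actually reason about trust.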



From the scenario described above, we can extract the following universal building blocks of trust that are applicable to autonomous vehicles:

Unambiguous Communication: You are not going to trust a system if you cannot get it to comprehend your intentions or understand what it is trying to do.

Competency: You will only trust a system if it performs as expected (and hopefully as advertised).

Reliability: You are unlikely to trust a system if it is unreliable.

Safety: You will not trust a system if an accident or malfunction poses a serious safety risk.

Fairness: You will not trust a system if it tries to take advantage of you.

Authenticity: If you are worried that the system is a counterfeit, then you are not going to trust it.

Protection of Privacy: If the system makes your private information vulnerable, then you should not trust it.

In addition to the above seven trust ingredients, if you are dealing with a system that includes a computer connected to the Internet, you need to worry about cyber-attacks. You should add the following item to your list of trust components:

Vulnerability to Cyber Attacks: If the system can be easily hacked, then you should certainly be very concerned and think hard before trusting it. If a system is capable of movement and it goes haywire due to malware, it can cause serious damage by banging into things.

We have looked at the attributes a system should have for us to trust it. The next question is how we implicitly or explicitly estimate these attributes. In other words, what is the process for building trust?



Let us consider another example. Your neighborhood is considering the acquisition of autonomous vehicles for picking up garbage and cleaning streets. You need to vote on the proposal. Your vote essentially represents an expression of trust in the proposed autonomous vehicles. Here is what you might be thinking as you get ready to make that decision:

First-Hand Experience: Your neighborhood ran a one-week trial before the vote, and you were able to see these vehicles in action.

Reputation: The system has been in use in several cities for two years. Fortunately, no serious accidents have been reported, and all reviews have been positive.

Referral: Your friend from a neighboring city is raving about it. She was initially worried that these vehicles might pose serious risks to pets and children walking on the side streets. However, she changed her mind. These vehicles seem to "see" everything around them and react appropriately.

Regulations: There are regulations in place that govern the safety of these autonomous vehicles and ensure that they operate at safe speeds and follow all traffic rules. The vehicles have been tested extensively by a third party for conformance to these regulations.

As you observed these vehicles during the trial phase, you probably paid attention to the following three characteristics:

Repeatability and Consistency: The vehicles follow the same pattern every day. If the vehicles did something very different every day, it would be difficult to know whether they were operating as designed or malfunctioning.

Predictability: The vehicles react to obstacles in predictable ways, so that people around them can learn to anticipate their behaviors and react accordingly.

Communicating Decision-Making Rationale: The vehicles should be able to explain their decision-making rationale, so that people know why a vehicle took a particular action.
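To make these characteristics concrete, here is a minimal sketch (the class, rules, and distances are all invented for illustration) of a vehicle controller that applies simple deterministic rules, so the same input always produces the same action, and records a plain-language rationale alongside every decision:

```python
from dataclasses import dataclass, field

@dataclass
class Decision:
    action: str     # what the vehicle did
    rationale: str  # why, in plain language

@dataclass
class StreetVehicleController:
    log: list = field(default_factory=list)  # auditable decision history

    def decide(self, obstacle: str, distance_m: float) -> Decision:
        # Deterministic rules: identical inputs always yield identical
        # actions (repeatability), and the priorities are simple enough
        # for bystanders to anticipate (predictability).
        if obstacle in ("child", "pet"):
            d = Decision("stop", f"{obstacle} detected {distance_m} m ahead; "
                                 "stopping until the path is clear")
        elif distance_m < 5.0:
            d = Decision("slow", f"{obstacle} within 5 m; reducing speed")
        else:
            d = Decision("proceed", f"{obstacle} at {distance_m} m poses no risk")
        self.log.append(d)  # every action is explainable after the fact
        return d

vehicle = StreetVehicleController()
print(vehicle.decide("pet", 12.0).action)       # stop
print(vehicle.decide("trash can", 3.0).action)  # slow
```

The decision log is the key trust-building feature here: after any incident, anyone can read back exactly why the vehicle acted as it did.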

A feasible way to develop trust in software components is to make them open source so that they can be audited by the crowd. Hackers should be given financial incentives to find and report vulnerabilities. That way, rather than using their genius for destructive purposes, hackers are encouraged to contribute to the debugging cause.



There is a good chance that I have missed a few important ingredients of trust and processes for building them. We really need to start paying attention to trust issues during the vehicle development phase and start engaging and educating the general public. Otherwise, this wonderful technology is unlikely to gain market acceptance.



I would like to hear your thoughts on “How to Develop Autonomous Vehicles that Engender Trust from the General Public?”