In the aftermath of the NSA spying revelations, our society is struggling to equip itself with the laws and public understanding necessary to deal with the spread of technology into every corner of our lives.

Self-driving cars are one place we can start to get it right. They provide yet another example of the challenges to autonomy and freedom brought by technology, and they have the potential to bring the debate home for people who aren't as concerned about privacy issues related to email and laptops.

On Tuesday, Google unveiled a proper self-driving car, with no steering wheel, no brakes, no pedals. Google expects these no-hands-on-wheel cars to hit the roads in 2017 and it is up to us to craft the laws and policies that will govern their use. Such decisions cannot be left for tomorrow. As Google’s working prototype reveals, the robocars of the future are here. And because people have a long history of projecting personal freedom and autonomy onto automobiles, they will have an innate understanding of the stakes.

#### About Camille Francois

[Camille Francois](https://twitter.com/camillefrancois) is a Fellow at the Harvard Berkman Center for Internet and Society. Her research focuses on cyberwar and cyberpeace, and related issues in surveillance, privacy and robotics. She is also a Fulbright Fellow and a Visiting Scholar at the Columbia Arnold A. Saltzman Institute of War and Peace Studies.

#### Case in Point: The Computer Parked in My Grandmother's Garage

My grandmother is an inspiring, smart, and fiercely independent 80-year-old Frenchwoman. Recently, she mailed me a folded-up New York Times article entitled “Close the N.S.A.’s Back Doors.” I called her, eager to know why she suddenly cared about back doors. We often have passionate political conversations, and I have tried many times, to no avail, to drag her into discussing the Snowden leaks and their consequences. She explained that she had happened to see “N.S.A.” in the article title and, knowing it might interest me, slipped it into the letter she had just finished writing. Was she concerned about NSA back doors, I asked? No.

Her lack of understanding and concern can be blamed on at least two things: poor metaphors and a narrow view of computing. For non-techies, a “back door” is not a very helpful metaphor. Technically, a back door is a method used to bypass the normal authentication process to secretly and remotely access computers. In the New York Times article, the term is mainly used in its broader sense, “to describe a range of policies and practices whereby governments compel, or otherwise get the cooperation of, private sector companies to provide access to data they control.”
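In the narrow technical sense, the mechanism is easy to show in a few lines of code. Below is a minimal, purely illustrative sketch (all names and credentials here are hypothetical): a normal login check with a hidden bypass planted inside it, the "spare key" the user never sees.

```python
# A visible, normal credential store: the "front door."
USERS = {"grandmother": "correct-horse-battery-staple"}

# Hypothetical hardcoded credential known only to whoever planted it.
# From outside the code, nothing reveals that this path exists.
_MASTER_KEY = "s3cret-spare-key"

def authenticate(username: str, password: str) -> bool:
    """Return True if the login should be accepted."""
    # The normal authentication process.
    if USERS.get(username) == password:
        return True
    # The back door: any username gets in with the planted key,
    # bypassing the normal check entirely.
    if password == _MASTER_KEY:
        return True
    return False
```

This is why the metaphor is programmer-centric: anyone reading the source sees both ways in, while the user, who only ever interacts with the login prompt, cannot.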


“So someone could access your computer without you seeing it, like someone entering by the back door while you were watching the front door,” I tried to explain to my grandmother. “Nonsense,” she told me. “If there was a back door in my house, I would see it. It’s my house, after all. The thing you’re describing sounds to me more like someone having a spare key to my house without me knowing it.”

A back door, indeed, is a programmer-centric metaphor: from inside the code, you can tell there are different ways to get in. It’s not a user-centric metaphor: by definition, from outside the code, the user can’t see it. That makes it challenging to discuss.

This brings us to the second problem: a narrow view of computing. My grandmother owns a desktop computer she uses once a year. It sits in a corner, far from her daily life. Miss Teen USA fears someone remotely activating her laptop camera to take naked pictures of her; my grandmother doesn’t. Nor does she own a smartphone that could make her worry about remote access and GPS tracking. For people who see their personal computers as crucial means of empowerment, freedom and autonomy (most of my hacker friends, for instance), back doors are a fundamental source of vexation. My grandmother is not one of them. She isn’t as emotionally invested in these tools, so potential vulnerabilities in them do not threaten her way of life. And she is not alone. This is sometimes true even for people whose opinion on the topic matters supremely: in 2013, for instance, Justice Elena Kagan revealed that eight of the nine U.S. Supreme Court Justices don’t use email at all.

However, in my grandmother’s daily life there is a computer that is filled with software, that is connected to the network (and therefore potentially remotely accessible), and that she does consider the single most important source of her personal autonomy. She just doesn’t think of it as a computer, because it is her car. Like many of the cars on the market and on the roads these days, it is packed with driving-assistance software and carries a little antenna on its roof. Her car, literally, is also a computer.

My grandmother never told me: “You have to learn software programming, it is how people become independent, autonomous, and take control of their lives in the 21st century.” Instead, she spent a great deal of time forcing me to take driving lessons, “because this is how a woman gets her independence, control, autonomy and freedom in life.” I still don’t have my driver’s license and she is very mad at me for it. Growing up, she had me read Françoise Sagan’s novels, and we would watch Thelma and Louise together. She is French but she shared this very American wisdom with me: the automobile will set you free.

It's quite clear: for most people, the link between government surveillance and freedom is more plainly understood through cars than through personal computers. As more and more objects become connected to the Internet, these questions will grow in importance. And cars in particular might become, as Ryan Calo puts it in a 2011 article on drones, “a privacy catalyst”: an object that gives us an opportunity to drag our privacy laws into the 21st century, and one that restores our mental model of what a privacy violation is.

When my grandmother starts to consider technologically enabled constraints on how she can drive, or the possibility of people knowing exactly where she goes, abstract issues of "autonomy" and "privacy" become much more real.

She started looking into it. Typing 'privacy' and 'cars' into a search engine, she quickly found Ford Global Vice President of Marketing Jim Farley’s declaration at the Consumer Electronics Show: “We know everyone who breaks the law, we know when you're doing it. We have GPS in your car, so we know what you're doing.” In her head, she started rewriting Thelma and Louise for an age of self-driving cars with remote government access for law-enforcement purposes. Surely the girls would have been located and arrested, or their car remotely stopped. "Well, that would make it a ten-minute movie, a YouTube clip?” she joked. Farley’s statement stirred a public debate, with Democratic Senator Al Franken of Minnesota questioning Ford about its data policies, Farley retracting his statement, competitors positioning themselves on the subject, and Ford CEO Alan Mulally calling for boundaries and guidelines to operate in this space.

Now, my grandmother understands and cares about this issue. And that is important because in order for our society to shape the rules that will make the future of self-driving cars one in which we want to live, we need all members of society to contribute to the conversation.

We need to ask: what happens when cars become increasingly like computers? With self-driving cars, are we getting the best of the computer industry and the car industry, or the worst of both worlds?

"Self-driving” is another misnomer. Driving decisions are never "self-made." They are accounted for by algorithms when they are not accounted for by drivers. These algorithms reflect many decisions that aren’t self-made either: they are the conscious answers to complicated safety, ethical, legal and commercial dilemmas. Calling a robotic car “self-driving” diverts attention from the surrender of autonomy to algorithms, making it harder to navigate the policy questions that arise.

Self-driving cars are coming slowly and progressively, with various stages of automation before the streets are filled with no-hands-on-wheel vehicles like the prototype Google revealed Tuesday, but they are surely part of our near future. They hold considerable promise for the environment and for road safety.

They also embody our debate over freedom, autonomy, and privacy in computing systems, revealing just how intrusive remote access to those systems, whether by governments or by individuals, can become.