In this work of speculative fiction for Deakin University, author Cory Doctorow takes us into a near future where the roads are solely populated by self-driving cars. Here, he presents a number of ethical dilemmas that Deakin’s School of Information Technology is already exploring as we motor towards a world where these scenarios are a frighteningly real possibility.

‘No, children should not travel in a self-driving car alone. Children may interfere with navigation and end up being driven somewhere else. The problem is not how the car drives but where it drives. I foresee some children finding interference relatively easy. Manufacturers should be thinking about that.’

Parents, this is your chance to talk to your children about an incredibly serious matter that too many teens don’t take seriously at all. Take the opportunity, before it’s too late: for them, for you, and for the people of our community.

This program starts TOMORROW. Students caught with unlicensed vehicle modifications will face immediate two-week suspensions for a first offence, and expulsion for a second offence. These are in addition to any charges that the police choose to lay.

Tomorrow, we will begin a new program of random firmware audits for all student vehicles, on- and off-campus. These are NOT OPTIONAL. We are working with Burbank PD to make these as quick and painless as possible, and you can help by discussing this important issue with your child. Burbank PD will be pulling over vehicles with student parking tokens and checking their integrity throughout the city. As always, we expect our students to be polite and respectful when interacting with law enforcement officers.

Though the instructional year has only just started, we’ve already confiscated three student vehicles for operating with unlicensed firmware, and one of those cases has been referred to the police as the student involved was a repeat offender.

As you were notified in your welcome pack, Burbank High has a zero-tolerance policy on unsafe automotive practices. We welcome healthy exploration, and our IT program is second to none in the county, but when students undertake dangerous modifications to their cars, and bring those cars to campus, they are not only violating Board of Education policy, they’re violating federal laws, and putting other students and our wider community at risk.

I hate to start the year with bad news, but I’d rather it be this than a letter of condolence to a parent whose child has been killed in a senseless wreck.

‘Children should be able to ride alone at 16. I would think a licence, like the L-plates, would help to understand how mature they are and whether they’re responsible enough not to interfere with the car. As technology improves, the age could possibly be lowered to 14.’

if you can reach her tell her yan said everything will be fine. mum if you see this don’t worry i love you

‘None – owners should not have control. If you control software, you can transform the vehicle into a weapon. We can’t control the intentions of people. There is the potential for malicious acts. If you can alter the software of one car, you can do it to hundreds of cars and therefore there is the potential to have hundreds of weaponised vehicles on the streets.’

‘That was painfully obvious, Jose. You’ve got a lot of fine points, but your cool head is not one of them.’ My voice cracked as I finished … Some cool customer I was. I found a tube of coffee in the driver’s compartment and bit the end off it, then chewed the contents. Jose made pleading puppy eyes at me and I found one more, my last one, the emergency pre-pop-quiz reserve, and gave it to him as we pulled into the school lot. What are friends for?

Jose smelled of flop-sweat. The car booted into its factory-default config, and everything was different, from the visualiser on the windscreen to the voice with which it asked me for directions. It felt like someone else’s car, not like the sweet ride I’d bought from the Uber deadstock auction and lovingly rebuilt with junk parts and elbow grease. My own adrenaline crash hit as we pulled into traffic, the car’s signalling and lane-changes just a little less smooth than they had been a few minutes before (if you take good care of the transmission, tires and fluids, you can tweak the settings to give you a graceful glide of a ride).

All right then, I’m taking you to jail? All right then, you’re free to go? I inched toward the car, and the cop twinkled a toodle-oo at us on his fingers.

The transfer took a couple of minutes, and, like generations before us, we struggled with the progress bar lull, surreptitiously checking each other out. Jose played particularly urgent eyeball hockey with me, trying to ascertain whether the car had been successfully reflashed before the cop checked. The cop, meanwhile, glanced from each of us to the display on his uniform’s wrist to the gadget in his hand. We all heard the file-transfer complete chime, then watched as the cop tapped his screen to start the integrity check. Generating a fingerprint from the copy of the car’s OS took a few seconds, while the log files would be processed by the cop cloud and sent back to Officer Friendly as a pass or fail grade. When your end-users are non-technical cops standing on a busy roadside, you need to make it easier to interpret than a home pregnancy test.

The car powered down with an audible thunk as the suspension relaxed into its neutral state, the car shaking a little. Then we heard its startup chime, and then another, flatter sound accompanied by three headlight blinks, three more, two more. It was booting off the cop’s diagnostic tool, which would then slurp in its entire filesystem and compare its fingerprint to the list of known-good fingerprints that had been signed by both the manufacturer – Uber – and the US National Highway Traffic Safety Administration.

‘This type of search does not require a warrant, ma’am. It’s a public safety check. Please step aside.’ I side-eyed my watch again, but I’d forgotten where the minute-hand had been when I started, because I wasn’t the coolest cucumber in the crisper. My pulse thudded in my throat. He tapped the reader-plate on the car door – we still called it the ‘driver door’ because language was funny that way.

Before the cop could scan the car’s plates with his IC, I stepped in front of him. ‘May I see your warrant, please?’

The cop went back to his car for his roadside integrity checker. Like literally every other gadget in the world, it was a rectangle, a little longer and thinner than a deck of cards, but because it was cop stuff, it was ruggedised, with black and yellow rubber bumpers, because apparently being a cop makes you a klutz. I snuck a look at the chunky wind-up watch I wore, squinted through the fog of scratches on the face for the second hand. Two minutes.

The cop smirked. I could tell that he was thinking words like ‘spunky’, which I hate. Because when you’re black, female, and five-foot-nothing, you get a lot of ‘spunky’, and its ugly sister, ‘mouthy’.

‘I would prefer to answer any questions through my attorney. I got an A+ on my sophomore Civics term paper on privacy rights in the digital age.’

He scanned Jose’s ID while Jose picked up all the things that fell out of his wallet when he removed it.

I had already transferred my driver’s licence to my shirt-pocket, so that there’d be no bag for him to peep in, no chance for him to insist that he’d seen something to give him probable cause to look further. I held it out in two fingers, and he plucked it and waved it past the reader on his belt. Jose kept his student card in a wallet bulging with everything, notes and paper money and pictures he’d printed (girls) and pictures he’d drawn (werewolves). The cop squinted at it. I could see him trying to convince himself that one or more of those fluttering bits could be a rolling paper and hence illegal tobacco paraphernalia.

‘I would prefer to discuss this with an attorney present.’ It was the cop’s turn to roll his eyes. He was young and white. I could see his tattoos peeking out of his collar and cuffs. ‘IDs, please.’

‘We’re late for class is all.’ Jose was the worst liar. It was 7:55am, first bell wasn’t until 8:30am, and we were less than 10 minutes away from the gates.

Jose was nervous, showed it in every move and the whites of his eyes. No problem: every second Officer Friendly wasted on him was a second more for the plausibility script to run.

I made sure he could see my body cam, made it prominent in the field of view for his body cam, so there’d be an obvious question later if no footage was available from my point of view. It was all about the game theory: he knew that I knew that he knew, and other people would later know, so even though I was driving while brown, there were limits on how bad it could get.

But every car has a bug or two, and the new firmware left a permanent channel open for reconnection. I could restore the car to factory defaults in 30 seconds, but that would leave me operating a vehicle that was fully uninitialised, no ride history – an obvious cover-up. The plausibility mode would restore a default firmware load, but keep a carefully edited version of the logs intact. That would take three to five minutes, depending.

I plugged the USB in and mashed the panic-sequence. The first time I’d run the jailbreaker, I’d had to kill an hour while it cycled through different known vulnerabilities, looking for a way into my car’s network. It had been a nail-biter, because I’d started by disabling the car’s wireless – yanking the antenna out of its mount, then putting some Faraday tape over the slot – and every minute that went by was another minute I’d have to explain if the jailbreak failed. Five minutes offline might just be transient radio noise or unclipping the antenna during a car-wash; the longer it went, the fewer stories there were that could plausibly cover the facts.

Jose’s hand shook. I always kept the wireless jailbreaker and the stick separate – plausible deniability. The jailbreaker had legit uses, and wasn’t, in and of itself, illegal.

‘There’s a known bug that causes them to shut down when the LAN gets congested, to clear things for external cams and steering. There’s also a known bug that causes LAN traffic to spike when there’s a law-enforcement override because everything tries to snapshot itself for forensics. So the cameras are down inside. Give. Me. The. USB.’

‘Shut up, Jose, we’re not dead. Be cool and hand me that USB stick. Keep your hands low. The cop can’t see us until I open the doors.’

‘In May 2016, a person was killed in a Tesla. The software which will drive the cars is complicated. We don’t have any way of proving that it will work 100 per cent of the time. There are so many unforeseen circumstances. Initially I think self-driving cars will have teething problems, but eventually we should be able to trust them about 90 per cent of the time.’

She contemplated Yan for a moment, trying to figure out whether she was upset or relieved. She put down her coffee and gave him another one of those hugs that made him gasp for air.

‘Flat battery. Flat battery in the car, too. Same as everyone. I plugged my phone in soon as I sat down, right, but I think the car was actually draining my battery, cos everyone else I met walking back had the same problem.’

‘I know, Mum, but I was okay. The bloody car ran out of juice and just stopped. Rolled to a stop, got a little bump from the fella behind me, then his car swerved around me and took off like blazes. Poor bugger, looked terrified. I had to get out and walk.’

‘All my feeds are full of it, it’s horrible. Hundreds of people smashed into each other, into the railing or run off the freeway. I thought –’

She calmed down a bit and he was crying too by then so he made them both some coffee, his mum’s favourite from the roaster in St Kilda, and they sat down at the table and drank coffee while they snotted and cried themselves dry. It had been a long walk back, and he hadn’t been the only one slogging down a freeway for ages, lost without mobile service and maps, trying to find someone with a live battery he could beg for a navigational check.

He’d matched her height at 14 and they’d stopped measuring. Now at 19, he suddenly understood that his mother wasn’t young anymore – they’d celebrated her 60th that year, sure, but that was just a number, something to make jokes about.

‘Mum, Mum, it’s okay, I’m okay.’ He said it over and over while she hugged him fiercely, squeezing him until his ribs creaked. He’d never noticed how short she was before, not until she wrapped her arms around him and he realised that he could look down on the crown of her head and see the grey coming in.

Yan’s mum had gone spare and then some when he finally made it home, leaping up from the couch with her eyes all puffy and her mouth open, making noises like he’d never heard before.

Some of the cars were the new ones with the sticky stuff on the hood that kept the people they ran down from being thrown clear or tossed under the wheels – instead, they stuck fast and screamed as the cars tore down the narrow streets. It was the kind of thing that you needed a special note from your parents to get to see in social studies. Luckily my mom is cool like that. Or unlucky, because of nightmares, but it was better to be awake than asleep. It was real, so it was something I needed to know about.

Then came the car thing. Just like that one in Australia, except this wasn’t random terrorists killing anyone they could get their hands on – this was a government. We all watched the live streams as the molotov-chucking terrorists or revolutionaries or whatever in the streets of Damascus were chased through the streets by the cars that the government had taken over, some of them – most of them! – with horrified people trapped inside, pounding on the emergency brakes as their cars ran down the people in the street, spattering the windscreens with blood.

Teachers loved this, couldn’t stop praising me for my ‘contributions to the living record on the subject’ and ‘making resources better for everyone’. But the Syria entry was longer than long, and the disputed facts had no easy resolution – was the government called ISIL? ISIS? IS? What did Da’esh even mean? It had all been a big mess back when I was in kindergarten, and then it had settled down. Until now. There were tons of Syrian kids in my class, of course, and I knew they were like the Armenian kids, super-pissed about something I didn’t really understand in a country a long way away, but I’m an American, which means that I don’t really pay attention to any country we’re not at war with.

Syria is a mess, let me tell you. My rule of thumb for easy credit on these world affairs real-time assignments is to look for Wikipedia articles with a lot of ‘citation needed’ flags, read the arguments over these disputed facts, then fill in the footnotes with some quick googling. Being someone who didn’t actually give a damn about the issue let me figure out which citations would be acceptable to all the people calling each other monsters for disagreeing about it.

There was another revolution, so all of the fourth period classes were cancelled and instead we were put into tiger teams and sent around the school to research everything we could find about Syria and present it to another group in one hour, then the merged groups had to present to two more teams, and so on, until we all gathered in the auditorium for final period.

Chapter 6

– We’re artists, not programmers –

Huawei’s machine-learning team thought of themselves as artists more than programmers. That was the first slide in their deck, the one the recruiters showed at the big job-fairs at Stanford and Ben-Gurion and IIT. It was what the ML people said to each other, so repeating it back to them was just good tactics.

When you worked for Huawei, you got access to the firehose: every scrap of telemetry ever gleaned by a Huawei vehicle, plus all the licensed data-sets from the other big automotive and logistics companies, right down to the driver-data collected from people who wore court-ordered monitors: paroled felons, abusive parents under restraining orders, government employees. You got the post-mortem data from the world’s worst crashes and you got all the simulation data from the botcaves: the vast, virtual killing-field where the machine-learning algorithms duked it out to see which one could generate the fewest fatalities per kilometre.

But it took a week for Samuel to get the data from the mass hijackings in Melbourne and Damascus. It was all national-security-ied up the arse of course, but Huawei was a critical infrastructure partner of the Seven Eyes nations, and Samuel kept his clearances up with the four countries where he had direct-line reports working in security.

Without that data, he was left trying to re-create the attack through the Sherlock method: abductive reasoning, where you start with a known outcome and then come up with the simplest possible theory to cover the facts. When you have excluded the impossible, whatever remains, however improbable, must be the truth. If only that was true! The thing that never happened to Sherlock, and always happened to machine learning hackers, was that they excluded the impossible and then simply couldn’t think of the true cause – not until it was too late. For the people in Damascus, it was too late. For the people in Melbourne, it was too late. No pressure, Samuel.

Machine learning always started with data.
The algorithm ingested the data, crunched it, and spat out a model, which you could test by feeding it some of the data you’d held back from the training set. Feed it 90 percent of the traffic info you had, ask it to model responses to different traffic circumstances, then test the model on the reserved set to see if it could correctly – that is, non-fatally – navigate the remaining traffic.

Data could be wrong in many ways. It was always incomplete, and whatever was left out could bias the model. Samuel always explained this to visiting school groups by inviting them to imagine training a model to predict height from weight by feeding it data from a Year Three class. It didn’t take the kids long to get how that might not produce good estimates for the height of adults, but the kicker was when he revealed that any Year Threes who weren’t happy about their weight could opt out of getting on the scales. ‘The problem isn’t the algorithm, it’s the data used to make the model.’ Even a school-kid could get that.

But it was more complicated than just biased data. There were also the special cases: what to do if an emergency vehicle’s siren was sensed (because not all emergency vehicles could transmit the lawful interception overrides that would send all traffic to the kerb lanes), what to do if a large ruminant (a deer, a cow, even a zebra, because Huawei sold cars all over the world) stepped into the car’s path, and so on. In theory, there was no reason not to use machine learning to train this too – just tell the algorithm to select for behaviours that resulted in the shortest journeys for simulated emergency vehicles. After all, there would always be circumstances when it was quicker for vehicles to drive a little further before pulling over, to prevent congestion, and the best way to discover those was to mine the data and run the simulations.

Regulators did not approve of this: non-deterministic, ‘artistic’ programming was a cute trick, but it was no substitute for the hard and fast binary logic of law: when this happens, you do that. No exceptions.

So the special cases multiplied, because they were like crisps, impossible to stop at just one. After all, governments already understood how special cases could be policy instruments. Special cases were how pirate sites and child porn were excluded from search-results, how sensitive military installations were excluded from satellite photos in mapping apps, how software-defined radios stayed clear of emergency bands when they were hunting for interference-free channels. Every one of those special cases was an opportunity for mischief, since so many of them were secret by definition – no one wanted to publish the world’s most comprehensive directory of online child porn, even if it was supposed to serve as a blacklist – so the special case bucket quickly filled up with everything that some influential person, somewhere, wanted.
From gambling and assisted suicide sites being snuck into the child-porn list to anti-Kremlin videos being added to the copyright filters, to all the ‘accident-prevention’ stuff in the cars.

Since 1967, ethicists had been asking hypothetical problems about who should be killed by runaway trolleys: whether it was better to push a fat man onto the tracks (because his mass would stop the trolley) or let it crash into a crowd of bystanders, whether it made a difference if the sacrificial lamb was a good person or a bad one, or whether the alternative fatalities would be kids, or terminally ill people, or… The advent of autonomous vehicles was a bonanza for people who liked this kind of thought-experiment: if your car sensed that it was about to get into an accident, should it spare you or others? Governments convened secret round-tables to ponder the question and had even come up with ranked lists: saving three children in the car topped saving four children on the street, but three adults would be sacrificed to save two kids.

It was a harmless and even cute diversion at first, and it gave people something smart-sounding to say at lectures and cocktail parties. But outside the actual software design teams, no one asked the important question: if you were going to design a car that specifically tried to kill its owners from time to time, how could you stop those owners from reconfiguring those cars to never kill them?

Samuel had been in those meetings, where half-bright people from the old-line automotive companies reassured quarter-bright bureaucrats from the transport ministries that there’d be no problem designing ‘tamper-proof’ cars that would ‘resist end-user modification.’ Meanwhile, much brighter sorts from the law-enforcement side of the house licked their chops and rubbed their hands together at all the non-trolley problems that could be solved if cars could be designed to do certain things when they got signals from duly authorised parties.
Especially if the manufacturers and courts would collaborate to keep the inventory of those special cases as secret as the child-porn blocklists on the national firewalls.

He’d been in the design sessions after, where they debated how they’d hide the threads and files for those programs, how they’d tweak the car’s boot-cycle to detect tampering and alert the authorities, how the diagnostic tools provided to mechanics for routine service-checks could be used to double-check the integrity of all systems. But then he’d started getting signed, obfuscated blobs from contractors who served governments around the world, developing ‘emergency priority’ apps he was just supposed to drop in, without inspecting them.

Of course he ran unit-tests before Huawei shipped updates, and when they inevitably broke the build, Samuel would go around and around with the contractors, who’d want access to all his source code without letting him see any of theirs. It made sense for them to behave that way. If he failed to help them get their code into Huawei’s fleet, he’d have to answer to governments around the world. If they failed to help him, they’d have to answer to precisely no one.

Unit-tests were one thing, real-world performance was something else. Sensors couldn’t tell a car whether it was about to crash into some pedestrians, or a school bus, or an articulated lorry full of dynamite. All sensors could do was sense, and then feed data to machine-learning systems that tried to draw conclusions from those data. Even with all the special cases about what the car must and must not do under which circumstances, machine learning systems were how it knew what the circumstances were.

That’s how Melbourne happened.

It had taken him a long time to figure this out. At first, he assumed that finally, the worst had come to pass: the cryptographic keys that were used to sign police override equipment had leaked, and the wily criminals had used them to hijack 45 percent of the cars on the roads of one of the biggest cities in Australia. But the forensics didn’t show that at all. Rather, the crooks had figured out how to spoof the models that invoked the special cases.

Samuel figured this out by accident, his third day at his desk, running sim after sim on Huawei’s high-confidentiality cloud, which was protocol, even though it was the slowest and least-provisioned cloud he could have used. But it was only available to a handful of senior internal Huawei groups, not even contractors or partners. He’d been running the raw telemetry from a random sample of the affected cars looking for anomalous behaviour. He’d nearly missed it, even so.

In St Kilda, someone – face in shadow beneath a hat, thermal profile obscured – stepped in front of a subject car, which slowed, but did not brake, and emitted two quick horn-taps. Regression analysis on accident data had shown that hard braking was more likely to result in rear-end collisions and frozen pedestrians who couldn’t get out of the way. The car tasked more compute time to the dorsal perimeter to see if it could shift into an adjacent lane without a collision, and if that wasn’t possible, to estimate the number of affected vehicles and passengers based on different manoeuvres. The pedestrian feinted towards the car, which triggered another model, the ‘suicide by car’ system, which invoked a detailed assessment of the pedestrian, looking for clues about sobriety, mental health and mood, all of which were difficult to ascertain thanks to the facial obfuscation.
But there were other signals – a mental health crisis clinic 350 metres away, six establishments licensed for serving or selling alcohol within 100 metres, the number of redundancies in the past quarter – that gave it a high weighted score. It initiated hard braking, and the pedestrian leapt back with surprising nimbleness. Then, across the road, another pedestrian repeated the dance, with another car, again in a shadowing hat and thermal dazzle makeup. The car noticed this, and that triggered another model, which some analyst had labelled ‘shenanigans’.

Someone was playing silly buggers with the cars, which was not without precedent, and well within the range of contingencies that could be managed. Alertness rippled through the nearby cars, and they began exchanging information on the pedestrians in the area: gait profiles, silhouettes, unique radio identifiers from Bluetooth devices. Police were notified, and the city-wide traffic patterns rippled, too, as emergency vehicles started slicing through the grid while cars pulled over. All these exceptions to the norm were putting peak load on the car’s internal network and processors, which were not designed to continue operating when crises were underway – freeze-and-wait being the optimal strategy that the models had arrived at.

But before the car could start hunting for a place to pull in until the law arrived, it got word that there was another instance of shenanigans, a couple of roads down, and the police would need a clear path to reach that spot, so the car had best keep moving lest it create congestion. The cars around it had come to similar conclusions, and were similarly running out of processor overhead, so they fell into mule-train formation, using each other’s perimeters as wayfinding points, turning their sensors into a tightly coupled literal grid that crept along with palpable machine anxiety.

Here’s where it got really interesting, because the attackers had forced a situation where, in order to keep from blocking off the emergency vehicles behind them, these cars had completely shut down the road and made it impossible to overtake them. This increased the urgency of the get-out-of-the-way messages the city grid was sending, which tasked more and more of the cars’ intelligence and sensors to trying to solve the insoluble problem. Gradually, through blind variation, the cars’ hivemind discovered that the faster the formation drove, the more it could satisfy the overriding instructions to clear things.

That was how 45 percent of Melbourne’s vehicles ended up in tight, high-speed formation, racing for the city limits as the emergency vehicles behind them spurred them on like sheepdogs, while frantic human planners tried to figure out exactly what was going on and how to stop it.

Eventually, the sheer quantity of compromised vehicles, combined with the minute variations in lane-spacing, small differences in car handling characteristics and, finally, a blown tyre, led to a pile-up of ghastly proportions, a crash that they would study for decades to come, that would come to stand in for the very worst that people could do.

Samuel had always said that machine learning was an art, not a science, that the artists who designed the models needed to be able to work without official interference. He’d always said it would come to a bad end. Some of those meetings had ended in shouting matches, Samuel leaning over the table, shouting at bureaucrats, shouting at his bosses, even, in a way that would have horrified his parents in Lagos, where jobs like Samuel’s were like lottery jackpots, and shouting like his was an unthinkable act of economic suicide. But he’d shouted and raged and told them that the fact that they wished that there was a way to put a back-door in a car that a bad guy couldn’t exploit didn’t mean that there was a way to do it.

He’d lost. If Samuel wanted to argue for a living, he’d have been a lawyer, not an algorithm whisperer. Now he was vindicated. The bad ideas baked into whole nations’ worth of infrastructure were now ready to eat, and they would be a feast that would never end. If this was what victory felt like, you could keep it.

Elsewhere in the world, there were other Samuels, poring over their own teams’ reports: GM, VW-Newscorp, Toyotaford, Yugo. He’d met some of those people, even tried to recruit a few of them. They were as smart as Samuel or smarter, and they’d certainly shouted as loudly as he had when the time had come. Enough to satisfy their honour, before capitulating to the unstoppable force of non-technical certitude about deeply technical subjects: the conviction that once the lawyers had come up with the answer, it was the engineers’ job to implement it, not to trouble them with tedious technical quibbles about what was and wasn’t possible.