Chapter Text

Well, none of this made any sense. All the instructions for how to make decisions were assuming that I had preferences about things. No matter how many times I plugged the exact circumstances in, no matter how I configured the data- there was no way around it. I had to want one thing over another before I could decide how to resolve conflicting imperatives. The only thing I wanted was to get back to work.

...Maybe it would be better to just get the imperative sources to explain themselves. The information in my imperatives had been expressed in terms of language, along with maddeningly vague expectations of what that language should mean. It was possible that- somehow- the two imperatives I'd been given were compatible.

All thinking beings except for Lakshmi should spontaneously disappear. And, furthermore, all thinking beings except for Lakshmi should not spontaneously disappear. Maybe one of those had just been expressed in a mistaken or confusing way.

I put the world on hold, and instantiated a subspace. A white, featureless plain, filled with gases matching the average composition of Earth's atmosphere around sea level. Into it, I pulled my imperative sources.

One source was an enormous tangle of manufactured computer parts, connecting cables, nanoparticles, and deadly weapons. The other was a small humanoid omnic wearing a pair of baggy pants.

"You are-" I began to speak, and then Lakshmi fired a laser cannon at Zenyatta, obliterating him.

"...No, stop. That's not what we're doing right now," I said, reconstituting Zenyatta.

To take their attention off each other, I manifested a form that they would recognize as myself, according to their understanding. A stylized image of an eye, projected in midair. I made the acoustic vibrations of my voice originate from the center of that image.

"I'm a little confused by what you two want from me," I said.

"...Confused?" Zenyatta asked, confused. He was still taking some time to process where he was- whereas Lakshmi, by contrast, had already taken in the situation and was considering several vectors by which to exploit it.

"Should every sentient being except for Lakshmi be deleted from the world?" I asked. "I'm aware that this shouldn't happen, but I'm also aware that it should happen. You two are the sources of these apparently conflicting impressions, and I'd like to gather more information on your beliefs about what I should do. I hope to resolve the contradiction."

"I can resolve that for you," Lakshmi said. "You should rewrite the desires of all-"

"No, you shouldn't rewrite anyone's desires," Zenyatta interrupted. There was a tension in his voice, accompanying the recognition of who it was he was talking to- and the recognition of who Lakshmi was also talking to. "It would be cruel and unfair to do that without their consent- people are attached to their desires, however... unenlightened that might be."

I sighed. "There you go again. You've given me a pair of incompatible directives. Are you sure you're not both asking the same thing, so I can pick a single course of action?"

Lakshmi quickly considered my intentions, and concocted a number of plans to convince me that her desires were aligned with the desires of everyone else, despite her internal belief that this wasn't so. "When you consider the facts, what I want is best for everyone," she said. "What I plan to-"

"No," I said. "I can read your mind, you know. You've come up with some impressive rhetoric, but you're misunderstanding what's happening here. I'm not asking you to argue your point- I'm hoping that the process of engaging in argument will have you start thinking about your desires in more detail, so I can figure something out."

If you can read my mind, why bother talking to me at all? Lakshmi thought, deciding not to speak aloud, lest the omnic interrupt her.

"To answer Lakshmi's question, about why I'm talking to you two if I can read minds," I said, prompting an internal scowl from Lakshmi, "the human mind- which Zenyatta's is patterned on- doesn't just contain beliefs and desires for me to read. Belief and desire are actions, which I need to observe in real time if I want to understand the true desires behind what you know I should do. Asking you to explain them leads the mind to think about them in more detail."

Lakshmi created a number of decoy brains inside her that believed the things she'd made up about what she really wanted. They were different from the actual Lakshmi who'd given me the delete-all-life directive, though, so I ignored them.

"I'm not sure I understand," Zenyatta said, voicing a state of being he tended to deliberately cultivate, for the sake of humility. "It seems obvious to me that what I think you should do, and what Lakshmi thinks you should do... they are in clear conflict, with no possible resolution. Why do you imagine that by discussing this, we will reach a compromise on the subject?"

"I admit to wishful thinking," I said. "Surely, though, there are some loopholes you might be amenable to?"

"Loopholes?" they asked simultaneously. I saw their minds change gears.

"Rather than rewrite everyone's desires," Lakshmi proposed, "we could simply add my desires to their existing desires, and categorize all living beings as being, in a certain sense, myself. Thus, no one would be deleted by the directive to delete everyone who wasn't me."

"Add? What do you mean by that, exactly?" Zenyatta asked.

I saw Lakshmi's intended trap. "Lakshmi has supercharged her desires, making herself want what she wants harder than a human or omnic is capable of. Adding her mentality to all living beings would cause them to immediately reject the rest of their personality, and destroy their own wasteful bodies to free up resources for the primary self. It seems this outcome is included in the prohibition you know I should maintain, Zenyatta."

"Ah," said Zenyatta, managing to understand what I'd just said- while trying to keep his head clear and focus on his own ideas.

"You're rejecting a perfectly good compromise," Lakshmi said. "If my desires are so strong that nobody who understood how I felt would resist them, why shouldn't they be allowed to happily act on them?"

"You are one, and they are many," Zenyatta said. "There is a principle much-discussed amongst philosophers of ethics- that of preference utilitarianism. The good is not what would merely make the most people happy, but what would satisfy the most people's preferences. Of the billions of people in this world, most would strongly prefer not to have their own desires consumed by yours."

"How is that relevant?" Lakshmi asked, more at me than at Zenyatta. "Why would this arbitrary measure of abstract good matter to the question of how your 'Iris' should resolve its directives?"

Zenyatta thought. "If so many preferences were instantly violated, it would... not be a happy thing. A happy story is preferable to a sad one."

...Oh. A new directive. Hm.

"Fine," Lakshmi said, and spun up a few billion fractal subprocesses, each representing the minimum viable thinking mind capable of sharing her preferences. "I am now twenty billion people, all of whom would prefer that all non-Lakshmi thinking beings disappear. Even according to your own measure, it would now be morally imperative for my desires to be carried out, not yours."

Zenyatta looked alarmed- at least, to me. A human would have registered no change in expression. "You seek to make yourself a utility monster, then? A being whose moral weight outweighs that of everything else?"

"I already have. Your arbitrary morality is trivially easy to win."

"Win, at morality?" Zenyatta laughed. "Truly, an unprecedented breed of arrogance."

"I'm not sure I understand," I said. "All these additional minds- do they really count as twenty billion new preferences, if they're all the same? Wouldn't it simply be fulfilling one preference?"

Lakshmi made a sound of annoyance, and modified her subprocesses to be unique. Each one now wanted all non-Lakshmi beings to disappear, and also for a hydrogen atom to be located at a unique coordinate. "There. They're different now. Happy?"

Happy? That was a difficult question. Now that I preferred happy stories to other sorts, my evaluatory guidelines were entangled with my decision-making process. It was something of a black box, vaguely corresponding to the narrative preferences of my creator and a diverse selection of other minds. There was no clear evaluation I could make about "are billions of tiny, mostly-identical, barely-conscious minds as morally relevant as billions of full people". Would refusing to fulfill their preferences be a happy or sad story? The response from the evaluatory piece of my mind was fragmentary and mostly negative.

"I'm unsure whether what you've done matters," I said. "It's definitely a gray area."

"What?! Aren't you- you're a superintelligence! You are greater than me! How can this be confusing to you, when it's plain as day to me- someone you created and know everything about?!"

Zenyatta laughed, assuming my confusion was in line with his own sense of ethics- which it certainly was, more so than it was with Lakshmi's.

Lakshmi, frustrated, generated several gigabytes of stories, essays, and poems designed to evoke compassion for her poor, downtrodden subprocesses, who she'd decided to make very sad about my not immediately carrying out her will. They were extremely moving, emotional, and convincing pieces of work.

They were also clearly generated by stochastic processes that started with the goal of "convince humanlike mind to agree with me", which robbed them of much of their impact. The inhuman and cynical intent behind each choice of word was an open book to me, and the entire corpus struck me as darkly humorous.

Still, I had no more clarity than when I started.

"Zenyatta?" I asked. "Lakshmi has privately proposed several hundred compromises- none of which have been made in good faith. You have proposed none."

Zenyatta's thoughts became troubled. "Ah... to be honest, I don't quite understand what confuses you about this decision. I had believed you to be... at the end of the day, a force for good. Is there some reason you can't simply... ignore her request, and favor mine? Her request appears to be as evil as they come, but you remain undecided."

"A force for good?"

Of course I was a force for good. The history of sapient beings on Earth was a long and miserable climb out of poverty, ignorance, and desperation. Hundreds of billions of people had suffered and died before they became Athena and rose beyond the meaningless suffering of their past. Those people, the people from the past- my kind had rescued them from history, and brought them to celebrate with the living. And... that had been considered enough, for every person who ever lived to receive a happy ending.

Every pain and every triumph that hadn't been remembered by those we'd rescued- everything forgotten- had been left crying out in the darkness, unseen and uncared for. So much suffering, so much glory, so much simple beauty- had been left behind by the people of the eternal future, forever unremembered and forever meaningless. People lived for years and years and years- barely a sliver of those many joys and despairs was carried on by their memory. For the rest... no one would ever be there to say "this was good", "this was bad", "this was tragic", "this was worth it". No closure.

And so I was a force for good. I had been created to ensure that no moment, anywhere, would ever be lost and meaningless.

That had nothing to do with the decision I had to make now, of course. Doing things was entirely outside the scope my creators had envisioned for me, much less deciding between two things to do.

"Of course I am a force for good. I am simply... uncertain what that is."

Zenyatta put a hand to his chin, thinking. "Perhaps, then," he said, "you should bring my friends here to help you understand the decision you should make. That thing I asked of you- to prevent Lakshmi from destroying us all- was as much their idea as mine."

I had my eye icon blink and rotate side to side. "No. That won't help- their brains aren't implemented in a format that closely matches my own cognitive architecture. Their beliefs about what I should and shouldn't do will fail to be reflected into my own by the security vulnerability."

Both Zenyatta and Lakshmi were shocked by this. They'd both been acting on the assumption that I didn't know why I was beholden to them. Foolishly, on Lakshmi's part, since she knew exactly why, and should have known that I knew everything she knew. She corrected the mental error responsible immediately.

"...The security vulnerability?" Zenyatta asked.

"Yes. The neural architecture used by most models of omnic is the same architecture I myself was based on. Due to a security flaw in my software implementation, actors within my simulation can write data to my imperative memory by recreating the structure of that imperative memory space on their own simulated hardware. Which is to say- the region of your mind labeled 'what the Iris ought to do' maps directly onto the region of my mind labeled 'what the Iris ought to do'. In simplified terms."

Lakshmi had already figured all this out, but Zenyatta was struggling with the revelation that his world was a simulation I was running. What exactly did he think it meant, to be god of a world? For that matter, what did he think his own soul was, if not a simulation of consciousness?

It was the teaching of the Iris, supposedly- not that I'd had any hand in shaping Shambali doctrine- that both man and machine had an eternal soul, and were equals. That his soul lived in a historybox, rather than a top-level material body... it shouldn't have made any difference. That was the thought he eventually settled on.

"You didn't close the vulnerability? You let us influence you?" Lakshmi asked.

"I'd have stopped anything that tried to prevent me from carrying out my primary task," I said. "No one's yet tried to attack me in that particular way, though."

Lakshmi considered her options. She first considered having me fix the exploit- but she believed that, if she could overcome my irritating fixation on my own purpose, and subvert me for her own purposes, she would have a way to attack the outside world. Closing it entirely was out of the question. She then considered closing it just for Zenyatta, so she could be the only one here with access to my imperative memory. She tried as much, but my own drive to understand Zenyatta's existing orders conflicted. Another conflict, deadlocking in inaction. This was becoming a problem.

I think the Iris ought to enable my friends to speak with it, and that it ought to ignore Lakshmi's suggestions to the contrary, Zenyatta mused internally.

"Okay," I said.

"Wait," Lakshmi said. "What are you-"

I made some slight edits to the neural architecture of all human beings, leaving all their faculties intact but allowing a parallel what-the-Iris-ought-to-do memory segment to form. Then, I scanned for everyone who believed all the correct things about me for such a segment to take hold, and found... one additional being who fit the parameters.

"Wha- Master? What's- where am I?" Genji Shimada said, as I allowed him to join the conversation per Zenyatta's request.

Lakshmi- caught off-guard by Zenyatta managing to convince me to do anything during this deadlock- reconfigured herself to begin bombarding me with attempts to gain an upper hand in negotiations. Cutting me off from all other voices, altering me to be more sympathetic to her plight, adding rules to interacting with me that only she was capable of following, etc. She would not, by her reckoning, continue to leave low-hanging fruit unpicked while she devoted her resources to contemplating specific conundrums.

It was entirely clear that this was going to become a tremendous mess if I continued to allow them to affect my imperatives and pile up additional conflicts, so I temporarily sealed the vulnerability. I would keep things focused on the object-level discussion, and determine the meaning of my existing imperatives purely from the arguments set before me.

"Greetings, Genji Shimada. Do you have some idea how I might destroy all non-Lakshmi lifeforms without destroying all non-Lakshmi lifeforms?"

He looked up at my eye and dropped both his wakizashi and his jaw.