The preparations had gone on all the previous night. Installing an empathy function into the Maybot’s operating system had proved a great deal trickier than the administrators had imagined. Time and again, the update had appeared to load satisfactorily only to end with a spinning circle of doom. Finally, just minutes before the Maybot was about to begin her first broadcast interview since the general election, an engineer had found a way to bypass the glitch. It wasn’t perfect but it would have to do.

“I’m very pleased to be joining you,” the Maybot told Radio 5 Live’s Emma Barnett. She didn’t sound particularly pleased, but being pleased was what she had been told the occasion demanded. Being pleased also conformed to Isaac Asimov’s second law of robotics. A robot must obey orders given by human beings except where such orders would conflict with the first law.

Barnett did not sound entirely convinced by the Maybot’s expression of pleasure, but let it go. When did she realise the election campaign wasn’t going to plan? This question threw the Maybot into some confusion, as she still wasn’t entirely sure that the election campaign hadn’t gone to plan. Yes, it hadn’t gone perfectly, but she couldn’t put her finger on anything she had actually got wrong. All the incoming data she had received up until the exit poll was published had indicated she was on track for a resounding victory.

“Did you shed a tear at the result?” asked Barnett, going for an early money shot. Proof that the Maybot had some human attributes would make headline news.

Had it been a tear or had she just sprung an oil leak? The Maybot hesitated. She couldn’t quite remember. But never mind. “Yes, a little tear,” she said. A tearette.

“Was there any moment when you thought about stepping down?” Barnett continued.

The Maybot looked confused. What kind of idiot question was that? Hadn’t Barnett studied the third law of robotics? A robot must protect its own existence as long as such protection does not conflict with the first or second laws. Of course she hadn’t thought about stepping down. She had won the election and it was her duty to form a government, otherwise she might self-destruct and be scrapped.

Even though the result had not turned out anything like she had been led to expect, she would still do everything the same all over again. Apart from the bits she would change. It hadn’t been her fault that some of the messaging of the Conservative campaign had been a bit off, as she hadn’t been in charge. She was only the prime minister, after all.

“Aren’t you out of touch?” an incredulous Barnett observed. Not at all, the Maybot insisted. She definitely didn’t regret calling the election because the Tories had gone on to win Mansfield. No Tory leader had managed that in years. And now she was back in No 10 she had proved her humility, by showing she had listened to young people’s concerns about the lack of housing by putting absolutely nothing about social housing in the Queen’s speech.

Although Barnett had teased out some flickering signs of personality, she still had doubts about the Maybot’s humanity. She gave it one last go by asking if she was a feminist.

“Er ... I have said that before,” the Maybot replied unsteadily, as her circuits began to overload and fail. She was soon restored to factory settings. Lifeless, deadbeat and only able to talk in the most mindless generalities.

The deal with the DUP was worth it because it gave the country a strong and stable government and, besides, the £1bn wasn’t that much because the economy was growing so fast it appeared to be contracting. Brexit was going to be a success because if you closed your eyes and hoped for something hard enough it invariably came true.

Everything she had ever done had definitely been the right thing to do, because doing the right thing was what she was programmed to do. The first law of robotics made that clear: a robot may not injure a human being or, through inaction, allow a human being to come to harm.

“What would you say now to your 16-year-old self?” Barnett concluded. The Maybot struggled with this. “Gosh,” she exclaimed. “Believe in yourself? Do the right thing?” Wrong. The correct answer was “be careful what you wish for”.