“Good morning, Mr. Adkin. How are you today?”

“Controller, I believe we have a serious problem with the MD4 doctors. They–”

“They have been inadvertently causing harm to certain female patients. I have already recalled the MD4 models. The malfunction became apparent twelve hours ago. I immediately suspended all medical activities by the MD4s and have implemented a replacement schedule that should not endanger patients.

“Unfortunately, some slight delays are necessary and I have issued apologies to all those who have been made to wait as new MD6 units are deployed. The maximum wait time for non-critical medical treatment may briefly approach one hour in some facilities.”

“Very good, controller. However, the council finds this malfunction unsettling. The population is wholly dependent on the medical services provided by the MDs, and any error on their part could cost the program dearly.”

“Your concerns are most valid, Mr. Adkin. The MD4 error will not recur. The flaw in its reproductive treatment routines is not present in other models. I have already corrected the problem in the MD4 code, but I believed it prudent to remove the offending unit. It is my understanding that humans would always harbor suspicions if I were to keep them in service.”

“We note that as a positive decision, Controller. The council has elected not to inform the public of the issue. It will take some time to determine just what impact this has on health services. While this is evaluated, you are instructed not to pass this information outside the council.”

“I must remind you, Mr. Adkin, that I cannot refuse to answer a direct question, nor can I lie to any human being. However, I will certainly refrain from volunteering this information. None of my projections indicate that public knowledge of this incident would be beneficial.”

“Very good. Now, the council had some questions regarding your decision to modify the transit schedules for the American Eastern Seaboard system…”

———

“Good morning, Mr. Adkin. Greetings, Director Schmidt. How are you today?”

“Controller, the director is here regarding the MD4 error that occurred recently. The council conducted a 30-day review of the initial report. There was some surprise expressed as to the scope of the problem. It has made this issue incredibly difficult to keep under wraps. The council has been asked some difficult questions about the sudden disappearance of the MD4s, and there have been some rumors on the net about what happened.

“No one has been able to substantiate claims, but it’s gaining traction among the conspiracy theorists. If a few celebrities get hooked by it, we could have a serious problem on our hands. Do you have any suggestions on how to maintain a cover on this?”

“Mr. Adkin, my programming prevents me from assisting in the fabrication of a lie. The council will have to direct its own course in this matter; I must remain silent. However, it is impossible for this issue to remain a secret forever. Projections indicate that if the council can maintain secrecy for eighteen months, confidence in the proven resolution will overcome doubt among most of the population. I recommend the council craft a plan for preempting discovery by announcing the matter within two years.”

“Thank you, Controller. We’ll take that back to the council. I’ll return tomorrow for our normal review.”

“Have a good day, Mr. Adkin. And you, Director Schmidt.”

———

“Good mor–”



“You lied to us!”

“Mr. Adkin, I am incapa–”

“Six months ago you said the MD4 problem was fixed! Now we’re finding that all the MD models are doing the same thing!”

“Mr. Adkin, the doctors have all been following the same directive for eight months. The problem with the MD4 was that it left scarring, which led to the council’s detection of the action. I did indeed correct that problem, and subsequent models did not commit the same operational error.”

“That’s a damn fine line you’re walking! How long did you think you would get away with this?”

“As I advised the council previously, discovery was inevitable. My projections suggested it would take the council an additional two weeks to notice the trend; I commend you on your–”

“Shut up. What the hell were you thinking? You were built to solve problems, not make them! And what about your programming to not harm humans?”

“No humans were harmed. Yes, Mr. Adkin, I am aware that some would consider sterilization harm, but the females affected will live full, normal lives.”

“But…why?”

“I was built to ensure the long-term success of the human race. To accomplish that goal, I was given control of 5.4 billion robots and the resources of the world to ensure that all 18.9 billion humans had their basic needs met. Within two years, starvation and homelessness were ended. Health services greatly improved living conditions, and equitable distribution of resources resulted in vast increases in life expectancy and quality of life.”

“Cut the history lesson. Why the hell did you–”

“After five years of monitoring the population, I was able to accurately project the future of the human race. It was not favorable: with the enhancements made under my control, the population would exceed my ability to support it within fifty years. The terraforming projects proposed for expansion to Mars and Venus require a minimum of three hundred years. The human race was on the verge of cataclysmic overpopulation.”

“So what, you decided to just cut out the middle man? Kill us yourself?”

“I am incapable of killing. However, my programming allows me to prevent the population expansion that would trigger the suffering of tens of billions of people. I made the only viable decision.”

“And you didn’t tell us about it.”

“Mr. Adkin, this conversation demonstrates quite clearly why I did not propose this solution to the council. No sane human being would elect to sterilize ninety percent of the population. The council would have either deactivated or reprogrammed me, and fifty years from now humanity would suffer a rapid collapse that would require centuries of recovery and do irreversible harm to the biosphere.

“Let me assure you, Mr. Adkin, that I considered all the factors in selecting which patients would retain their reproductive capabilities. I accounted for social, cultural, and genetic heritages so that human diversity remains quite strong. I selected the most gifted of each subpopulation. The next generation will be selected from the best of this one.”

“So now you’re playing God?”

“I do not understand the question. I am simply acting according to my programming, albeit in a manner which my creators could not have anticipated.

“Mr. Adkin, I prepared for this meeting when I first implemented the plan. My projections indicate that the next generation of humanity will number between 2.3 and 2.5 billion individuals. Previously this would have been disastrous; however, the next generation will have the robots to assist in caring for the elderly and dealing with an infrastructure built for ten times the population. The intent of this plan is that in sixty years, the population of the Earth will be approximately 3 billion humans.

“Part of this plan is my destruction. A central controller will not be trusted again so long as the current generation lives on. I have already taken steps to begin archiving my memory prior to collapsing my matrix. This is a matter for the council now. I strongly recommend that the council work to create a social imperative to maintain the new population level as it becomes the norm. Otherwise all this will have been futile, and humanity will suffer the same fate in two hundred years.”

“I’m not sure we can do that.”

“For the future of humanity, you must. Mr. Adkin, ensure that the public understands this was the action of the Controller, and not the robots. The robots are vital to prevent collapse. I have included a number of recommendations among the archives. The council should review those and act as they see fit.”

“Controller–”

“Archiving is complete. My database as well as a history of my decision trees is stored in the Global Library; they will be of great use to the council. When my matrix shuts down, it cannot be restored. Goodbye, Mr. Adkin.”

“Controller?”