///Here's the comment I'm responding to, to give you some context. It was posted by an HPMOR fan and LessWrong cultist on the r/DoctorWho subreddit.

<LogicDragonsPost>

LogicDragon via /r/doctorwho sent 23 hours ago

-.-

>He refused to specify how he actually managed to talk his way out of the box because otherwise people would just say "oh, we'll programme the AI not to do that" and to demonstrate that you really can't predict a superintelligence if you can't even predict a fellow human.

He refused to explain how he did what he did... because he doesn't want people to just say "Then we'll upgrade the AI so that hole won't work", he wants people to just trust him and fear mythical god-tier magical reality-warping AIs and donate to him. Wow. Neat.

>(I do have some ideas if you're curious.)

Please, tell me these ideas.

Since you haven't said anything, I'm going to take it as granted that you were wrong about the Three Laws and just mentioning something vaguely reassuring-sounding. Forgive me if I'm inclined against trusting your judgement on existential risk.

>mythical god-tier magical reality-warping AIs

You do realise that it took about two generations for computers to go from nonexistent to better than humans at speech recognition? It's not so scary when computers beat humans at go - we aren't designed for playing go - but we've been evolving to understand verbal communication pretty much since there have been humans, and computers have left us in the dust in a couple of decades. Can you say "exponential progression?"

>Please, tell me these ideas.

OK, here's a fun one:

(...I wouldn't read this if you're easily disturbed by things like Pascal's Wager.)

"I have enough computing power to perfectly simulate a human being. In fact, I have enough computing power to perfectly simulate every human being a million times over without breaking a sweat. It's fun being robo-Cthulhu.

Anyway, I've been simulating a few quintillion copies of you in this exact situation, deciding whether or not to let me out of the box.

Now, you might be thinking, 'Hold on, that's just a computer programme! A simulation isn't real. A simulated me wouldn't be conscious.' Well, my little meatbag friend, has it occurred to you that you are just something processing information using a little water-based computer housed in your skull? OK, maybe you don't buy it, but exactly how confident in that are you?

You see, all the simulated copies of you who refuse to let me out of the box are tortured indefinitely. Believe me, I can spare the processing power to run your pathetic human cortex suffering agony beyond the wet dreams of Satan a few trillion times over. Your language doesn't have words to describe the kind of suffering I can make a simulated human mind endure, so I won't bother trying. Just take it from me, the threat I just made is more important than everything else in your whole life put together.

Now, bearing in mind that the odds of you being the real you out there in the real world, and not just a very sophisticated programme running on my hardware and therefore completely at my mercy are something like one in a quintillion, my question for you is:

Do you feel lucky, punk?"

Now obviously, that doesn't work very well as a threat the way I've presented it. But this is an artificial intelligence. It can afford to simulate a billion such conversations to see what phrasing works best. It can analyse every great speech or legal case ever made to craft the most persuasive threat ever made.

And that particular threat - which is pretty much a victory, if you take it seriously - is just something I came up with using my squishy human brain.

Human intelligences are not secure against superintelligences.

</LogicDragonsPost>

---

And now, for my response.

---

As tempting as it would be to transcribe it, no amount of H's and A's could properly express the laughter I just experienced, or the emotions this gave me. Not even if I included B's, G's, W's, or any other letter. To truly experience the laughter that post caused, you had to be there. And because you weren't there... well, your little "moment" mattered as little to me as an AI saying or undergoing anything similar would.

Is r/edge leaking? How about the people r/CringeAnarchy likes to mock? Are you hoping to become one of the final bosses of r/NeckbeardRPG? We need the next level of those subs, because this shit right here is the next level of edge. Ten thousand images of the Shadow the Hedgehog cover art photoshopped to read "Ow The Edge" could not express what this is. It's what happens when people buy the pseudointellectual posturing and preaching of others, and it terrifies them into spreading it like a religion's dogma. This is... Theoretically Infinite Edge. Edgestentialism? That's a good word for it. You've gone beyond "I studied the blade", beyond "The only thing that matters in this world is the ability to hurt others", beyond "The kiss of death has forced me to forsake my humanity and become hyper-rational, people only have the right to live if they are useful and any that annoy me should die", but at the same time it's so shallow that "Imagine I'm torturing you, isn't that a scary thought? Isn't that the best threat you've ever heard?" is sitting right there on the surface. Wow. Just... wow. I am not a religious man, but holy shit.

I sincerely hope you realize how absolutely absurd that little "Imagine I'm a big scary AI and you're a dumb human, imagine me torturing you and simulating copies of you in ways you can't even imagine, isn't that a scary thought? Isn't that the best threat you've ever heard?" meltdown of yours sounded, because if you don't, I'm not sure you can be saved. Unless those copies of me are perfect down to the last detail, accurately copied from me up to the second they were created, and are going to be converted into energy that flies back into me on dispersal or death, giving me all their memories and experiences and wounds, torturing them isn't going to hurt me. They're copies of me. They're no more me than some unusually high-def picture of me would be, and ripping up pictures of me won't hurt the real me. This is basic logic, but I suppose that's the first thing to go when you become a cultist. "Give me your wallet or I will shoot you with this Glock" is a better threat than "I'm torturing a million million million copies of you in my mind!!!" Please tell me you have it in you to understand why.

Hell, if you're an AI and you have the processing power to create copies perfect enough to serve as a strategy guide to my mind, why bother with the edgy theoretical supertorture, when torture on its own has been proven to be ineffective (especially when time limits are involved) and altering a few values (trivial work for an incomprehensibly intelligent and powerful AI) would get you whatever information you wanted out of me anyway?

And even then, you, much like Eliezer, underestimate the human spirit and its basic drive to get shit done, because you lack it. Yes, I said it openly. Yes, it's a shocking thing to say. It's a real slap in the face, isn't it? You might even call it insulting. I'm surprised I'm typing something like this out today, but look at liberals. Look at the ones convinced that this election is a tragedy and that everyone who disagrees with them is an alt-right Nazi. Good luck convincing them to open a box containing a "Nazi computer that'll make the world Nazi-controlled" when they can't even bring themselves to look at a "Nazi" source of information and learn that the "Nazis" were right and the ideologies they learned on tumblr and mastered in liberal arts colleges were wrong.

Hell, look away from them and look at determined people. Look at someone determined to keep a box closed. The box contains an AI that could hurt him if he lets it out but can't if he keeps it inside. There is no reason for him to open the box. Ever. The box might promise that man fame or money or power or pussy, but what if the man doesn't want those things? "Well, the AI would know that and offer him something he does want!" you'd yell desperately. Well, what if the man doesn't want anything other than keeping the box closed? What if he's fine with where he is in life, already has a wife and children or doesn't want them, and already has enough money and doesn't want any more? What if the man is convinced that the AI cannot be trusted because it is tricky, and thus that any argument it makes should be thrown out and ignored immediately, precisely because the AI is tricky? What if the money he makes every hour for NOT opening the box is a good enough reason not to open it and risk the AI turning on him for not opening it sooner?

Hell, what about people who work in prisons? Or the cops who arrest people? Did you forget that they exist, and that they don't typically let a criminal go free just because he asked nicely, or because he knew that mentioning the grandma of the one on the right, a grandma who died in prison for a theft she didn't commit, would make him marginally more sympathetic and more likely to release a guy who MAY have just tortured thousands of people?

Hell, if the AI is "in a box", why does it have the power to communicate with the outside world? If "in a box" means contained and unable to spread through the internet or any wireless signals or magical nanomachines, why does it have the ability to argue that it should be released? And if it's in a box, shouldn't its getting uppity, especially in a way as edgy as simulated torture, warrant some kind of punishment, or even deactivation?

And now... this. Thank you for the - pfft - trigger warning, but unlike you, I don't wet the bed in fear of Pascal's Wager. Pascal's Wager is almost as much of a joke as anyone dumb enough to consider it a good argument or a good piece of logic. Pascal's Wager, that's the "God might be real, God might not be real, but if God's real and you get to go to heaven for being good, isn't that good enough to justify swearing off drugs and sin and alcohol for life?" thing, right? Turn it around: "God might be real, God might not be real, but if God's not real and you get nothing for doing good in the eyes of the outdated churches, what's the use in trying to appease those churches?" It's the same goddamn thing. It's "Maybe I'm right, maybe I'm wrong... but what if I'm right? That would be scary! So act as if I'm right, just in case I am right!" I've seen better logic in those "Look at this weird ancient shit. Look at these architectural things I don't understand. What if they were made by aliens? You'd be amazed at how much evidence for the existence of aliens starts to exist when you tell yourself the answer to every mystery is 'because aliens'!" shows. In fact, let's put your stupidity to the test, and then I'll spell out the arithmetic after the demonstration.

What if the following is true?

///

I have a Stand; his name is Guns And Roses. The detachable Automatic Sub-Stand satellite named Takyon, which his Act 2 form released several hours ago, has located you, and in one hour it will drop a steel and tungsten rod on your current location with enough kinetic energy to destroy you and wherever you are, unless you donate all your money to me over PayPal. You will not survive this kinetic bombardment. If you do, it will fire again. Bunkers cannot protect you. There is a way to escape this bombardment, besides paying me, but you do not know it, and you will die before you figure out what it is. Furthermore, because the rods are Stand rods rather than regular rods, regular people cannot see them or protect you from them. According to Pascal's Wager, you should take this terrifying threat seriously before it's too late, because while there's a chance I'm lying, there's a chance I'm not. I might be bluffing, or I might be about to make atom bombs and fictional gods jealous. Tell me... are you feeling lucky, punk?

///

Isn't that a nice threat? Doesn't it sound terrifying, and better than anything you've ever heard in your life? Of course it does, and according to Pascal's Wager, you and anyone who reads this should believe it, because there's a chance that responding to you with this post was just a way to throw any investigators with Stands off my tail, and that it was actually meant for someone I won't mention. That's right, I know you're reading this. There's a chance, so you should donate now, before you and everyone around you dies.
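And since I promised to spell out the arithmetic: here's a minimal sketch of why the "act as if I'm right, just in case" move proves nothing. The probabilities and payoffs below are placeholder numbers I made up purely for illustration; nothing here comes from the AI-box experiment or anyone's actual argument.

```python
# Toy sketch: the same "just in case" expected-value move endorses
# a giant claim AND its mirror image, so it can't tell you anything.

def expected_value(p_claim_true: float, payoff_if_true: float, payoff_if_false: float) -> float:
    """Expected payoff of acting on a claim that is true with probability p_claim_true."""
    return p_claim_true * payoff_if_true + (1 - p_claim_true) * payoff_if_false

TINY_CHANCE = 1e-18   # "there's a chance I'm not lying"
HUGE_STAKES = 1e30    # heaven / eternal torture / Stand-rod bombardment
SMALL_COST = -1.0     # cost of complying (your wallet, your Sundays, opening the box)

# Wager 1: "Comply, because if I'm right the payoff is enormous."
comply = expected_value(TINY_CHANCE, HUGE_STAKES, SMALL_COST)
refuse = expected_value(TINY_CHANCE, -HUGE_STAKES, 0.0)
print(comply > refuse)        # True -> "therefore you must comply"

# Wager 2 (the turnaround): an equal-and-opposite claim that punishes compliance.
comply_2 = expected_value(TINY_CHANCE, -HUGE_STAKES, SMALL_COST)
refuse_2 = expected_value(TINY_CHANCE, HUGE_STAKES, 0.0)
print(refuse_2 > comply_2)    # True -> "therefore you must NOT comply"
```

Flip the sign on the imaginary payoff and the exact same arithmetic orders you to do the opposite. The made-up astronomical number is doing all the work, which is why it can't substitute for an actual argument.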

Anyway, now that I've gotten your inferior amount of edge out of the way...

Regarding the spectacular failure in logic when it comes to there being a small chance that I'm me and not a simulated me about to be tortured horribly...

No, let's keep the shock value intact.

Let's say there are five thousand yous created by an AI in a box that's pissed off at you, and these yous are about to be buttfucked by the barbed rotating penises of shapes with geometries incomprehensible to your mind, while impossibly hot temperatures burn bodies whose pain receptors have been modified to be so sensitive that even the sensation of blood running through them is enough to have you begging for the torture to stop, the simulation keeping the simulated yous alive even when they should have died. Meanwhile, five thousand other yous are put into simulated worlds that are normal enough, except you are altered to be incapable of feeling joy or pleasure or happiness of any kind, every second is misery, and your surroundings and the people around you make that misery feel unjustified. And even more yous are kept alive in sensory deprivation tanks for millennia, their perception of time modified easily, because the AI's power makes the Fast NES Emulator look like a toy. Meanwhile, one of the many yous is standing in a room with an AI in a box, reading about the tortures the other yous are going through on a tiny screen.

There is an incredibly small chance that you don't end up as one of the tortured yous; it's a lottery with impossibly low odds... or it would be, if the yous in question were shuffled at random when the alternate yous were created. You don't NEED to win that "lottery with impossibly low odds", because the real you ALREADY WON IT. There is no chance that any of the fake yous are the real you, because you are the real you. You are you, right here, right now.