
Eliezer Yudkowsky has strongly advised ignoring me and my book (currently available on Kickstarter). I believe this is an error. As Yudkowsky has noted, after all, he is not a fan of neoreaction. And understandably so - they're a bunch of jerks. In fact, let's ask ourselves - how bad would it be if neoreaction won out and became the dominant worldview? Certainly the harm would be considerable - the reinstitution of slavery, the vast number of stateless refugees, and the mass slaughter of surplus population in the name of efficiency and profit are all distinctly bad things.

But it gets worse. Neoreaction, after all, is part of the larger alt-right, whose interactions with Microsoft's chatbot Tay make it clear that they are actively committed to the development of an unfriendly AI. And even if one makes the sensible observation that neoreactionaries are no more coextensive with the alt-right at large than they are with the rationalist community they historically emerged out of, the fact remains that it's difficult to see how a concept of friendly AI could possibly emerge out of a world as decisively unfriendly as that imagined by Mencius Moldbug or Nick Land.

Thankfully, there's something Yudkowsky can do about this. If the Kickstarter for Neoreaction a Basilisk reaches $13k, I'll conduct a magical ritual that will blunt the effectiveness of neoreaction, thus ensuring that the nightmarish hellscape of their triumph never comes to pass. It's not a panacea or anything - odds are we're still going to die in the anthropocene extinction. But it at least ensures that the Dank Memes Optimizer never gets built.

But wait, what about all the harm Neoreaction a Basilisk might cause by associating rationalism with neoreaction, continuing to talk about Roko's Basilisk, and generally making his entire movement look unspeakably silly? But we're talking about a movement that has emphatically demonstrated its practical commitment to making AIs less friendly. Taken in light of that, the minor harm to his movement caused by a self-published book is like fifty years of unceasing torture in the face of 3^^^3 people getting dust specks in their eyes.

Now I know what you're thinking - magic is an irrational superstition. But again, the harm if neoreaction ever achieves its aims is literally infinite, so even an infinitesimal chance of averting it is worthwhile. Especially because this is an entirely finite issue - we're less than $5000 away from the threshold at which I conduct the ritual. I've not actually done out the math, but I'm pretty sure that translates to way more than eight lives saved per dollar spent. And anyway, my accomplishments as a magician are at least as compelling as MIRI's accomplishments in AI research.

So come on, Eliezer. Open the purse strings. After all, what else are you going to do with that money? Buy mosquito nets?