Eliezer Yudkowsky has strongly advised ignoring me and my book (currently available on Kickstarter). I believe this is an error. As Yudkowsky has noted, after all, he is not a fan of neoreaction. And understandably so - they're a bunch of jerks. In fact, let's ask ourselves - how bad would it be if neoreaction won out and became the dominant worldview? Certainly the harm would be considerable - the reinstitution of slavery, the vast number of stateless refugees, and the mass slaughter of surplus population in the name of efficiency and profit are all distinctly bad things.
But it gets worse. Neoreaction, after all, is part of the larger alt-right, whose interactions with Microsoft's chatbot Tay make it clear that they are actively committed to the development of an unfriendly AI. And even if one makes the sensible observation that neoreactionaries are no more coextensive with the alt-right at large than they are with the rationalist community they historically emerged out of, the fact remains that it's difficult to see how a concept of friendly AI could possibly emerge out of a world as decisively unfriendly as that imagined by Mencius Moldbug or Nick Land.
Thankfully, there's something Yudkowsky can do about this. If the Kickstarter for Neoreaction a Basilisk reaches $13k, I'll conduct a magical ritual that will blunt the effectiveness of neoreaction, thus ensuring that the nightmarish hellscape of their triumph never comes to pass. It's not a panacea or anything - odds are we're still going to die in the anthropocene extinction. But it at least ensures that the Dank Memes Optimizer never gets built.
But wait, what about all the harm Neoreaction a Basilisk might cause by associating rationalism with neoreaction, continuing to talk about Roko's Basilisk, and generally making his entire movement look unspeakably silly? But we're talking about a movement that has emphatically demonstrated its practical commitment to making AIs less friendly. Taken in light of that, the minor harm to his movement caused by a self-published book is like fifty years of unceasing torture in the face of 3^^^3 people getting dust specks in their eyes.
Now I know what you're thinking - magic is an irrational superstition. But again, the harm if neoreaction ever achieves its aims is literally infinite, so even an infinitesimal chance is worthwhile. Especially because this is an entirely finite issue - we're less than $5000 away from the threshold where I conduct the ritual. I've not actually done the math, but I'm pretty sure that translates to way more than eight lives saved per dollar spent. And anyway, my accomplishments as a magician are at least as compelling as MIRI's accomplishments in AI research.
So come on, Eliezer. Open the purse strings. After all, what else are you going to do with that money? Buy mosquito nets?
Neoreaction a Basilisk, a work of philosophical horror and horrible philosophy, is available on Kickstarter through the month of May.
Comments
Carl Churchill 4 years, 9 months ago
Why throw in that silly mosquito nets jab in the end?
Elizabeth Sandifer 4 years, 9 months ago
Really? That's the jab that goes too far?
Devin 4 years, 9 months ago
Because the rest of the post was so unutterably serious and it needed something to lighten it up?
David Gerard 4 years, 9 months ago
That's your "too far"?
Tim B. 4 years, 9 months ago
Mosquito nets are a cheap and effective way of helping the 3.2bn people in the actual world at risk of actual malaria.
Roko's Basilisk is an SF concept that can be dismissed in so many trivial ways it's just embarrassing that apparently intelligent people are wasting their time & resources on an idea that they believe something far more intelligent than them will subsequently waste their time & resources on.
Seriously, I've just read the RationalWiki page on Roko's, and quite frankly, whilst I'm not big on offending idiots, fuck those guys.
Tim B. 4 years, 9 months ago
Doesn't appear to be an edit/delete function (only the 'their' was supposed to be in italics for emphasis), but whilst I'm here I'll add: why is it that prioritising a theoretical future apocalypse over an actual current one seems to be a thing about the right? Gun control, for example.
And whilst Captcha confirms I'm not a robot, it doesn't deal with the issue of whether I'm not a future AI simulation...
David Gerard 4 years, 9 months ago
To be fair, Yudkowsky does not believe in or endorse the Basilisk idea.
To also be fair, Yudkowsky does believe in and advocate almost all of its prerequisites.
And to be absolutely clear, Yudkowsky has explicitly stated that giving him money for AI risk is more important than the mosquito nets.
Aberrant Eyes 4 years, 9 months ago
"what is it about prioritising a theoretical future apocalypse over an actual current one seems to be a thing abut the right?"
I say it's leftover pollution of the meme-stream from the Reagan administration and their semi-serious belief that they were lucky enough to live in the End Times.
Jeff Heikkinen 4 years, 9 months ago
"And anyway, my accomplishments as a magician are at least as compelling as MIRI's accomplishments in AI research."
This is such a great sentence.
S 4 years, 9 months ago
"If the Kickstarter for Neoreaction a Basilisk reaches $13k, I'll conduct a magical ritual that will blunt the effectiveness of neoreaction, thus ensuring that the nightmarish hellscape of their triumph never comes to pass"
Are you saying you have the power to save the world, but you'd rather hold it to ransom so that people will give you money?
You monster.