It IS the same log

10 Comments

  1. Carl Churchill
    May 14, 2016 @ 7:08 pm

    Why throw in that silly mosquito nets jab in the end?

    • Elizabeth Sandifer
      May 14, 2016 @ 7:12 pm

      Really? That’s the jab that goes too far?

    • Devin
      May 14, 2016 @ 7:23 pm

      Because the rest of the post was so unutterably serious and it needed something to lighten it up?

    • David Gerard
      May 14, 2016 @ 10:47 pm

      That’s your “too far”?

    • Tim B.
      May 15, 2016 @ 6:29 am

      Mosquito nets are a cheap and effective way of helping the 3.2bn people in the actual world at risk of actual malaria.

      Roko’s Basilisk is an SF concept that can be dismissed in so many, many trivial ways that it’s embarrassing that apparently intelligent people are wasting their time & resources on an idea that they believe something far more intelligent than them will subsequently waste their time & resources on.

      Seriously, I’ve just read the RationalWiki page on Roko’s and, quite frankly, whilst I’m not big on offending idiots, fuck those guys.

      • Tim B.
        May 15, 2016 @ 8:57 am

        Doesn’t appear to be an edit/delete function (only the ‘their’ was supposed to be in italics for emphasis), but whilst I’m here I’ll add: why does prioritising a theoretical future apocalypse over an actual current one seem to be a thing on the right? – gun control, for example.

        And whilst Captcha confirms I’m not a robot, it doesn’t deal with the issue of whether or not I’m a future AI simulation…

        • David Gerard
          May 15, 2016 @ 11:01 am

          To be fair, Yudkowsky does not believe in or endorse the Basilisk idea.

          To also be fair, Yudkowsky does believe in and advocate almost all of its prerequisites.

          And to be absolutely clear, Yudkowsky has explicitly stated that giving him money for AI risk is more important than the mosquito nets.

        • Aberrant Eyes
          May 15, 2016 @ 1:17 pm

          “why does prioritising a theoretical future apocalypse over an actual current one seem to be a thing on the right?”

          I say it’s leftover pollution of the meme-stream from the Reagan administration and their semi-serious belief that they were lucky enough to live in the End Times.

  2. Jeff Heikkinen
    May 15, 2016 @ 10:49 pm

    “And anyway, my accomplishments as a magician are at least as compelling as MIRI’s accomplishments in AI research.”

    This is such a great sentence.

  3. S
    May 18, 2016 @ 8:58 am

    “If the Kickstarter for Neoreaction a Basilisk reaches $13k, I’ll conduct a magical ritual that will blunt the effectiveness of neoreaction, thus ensuring that the nightmarish hellscape of their triumph never comes to pass.”

    Are you saying you have the power to save the world, but you’d rather hold it to ransom so that people will give you money?

    You monster.
