12 Comments

  1. Froborr
    April 26, 2016 @ 9:54 am

    It occurs to me that there’s a certain amount of similarity between the logic of the Great Filter and the stabbed-in-the-back myth of white supremacy. In both cases, there is an expectation that what the speaker thinks of as their kind should absolutely dominate the universe, and in response to the realization that they don’t, the speaker turns to conspiracy theories rather than question the assumptions that led to the original expectation. The Great Filter is, essentially, a stabbed-in-the-back myth for sapience-supremacists.

    • Elizabeth Sandifer
      April 26, 2016 @ 2:18 pm

      The Great Filter doesn’t imply dominance; just detectability.

      • Froborr
        April 26, 2016 @ 3:56 pm

        Well, but they both start with “my mode of being is the bestest mode.” This then confronts the SETI proponent with the obvious question, “So why hasn’t it evolved everywhere, such that the sky is full of ETIs talking to each other?” while the white supremacist is confronted with the question, “So how come your life sucks?” Both then resort to conspiracy theories for answers, rather than considering the possibility that their mode of being isn’t as obviously superior as they think it is.

        • Elizabeth Sandifer
          April 26, 2016 @ 4:25 pm

          I don’t think the value judgment is an inherent part of the Great Filter model, unless you want to treat “apparently sole existent” as a synonym for “best.”

        • Devin
          April 28, 2016 @ 9:38 am

          Right. I mean, maybe Land or Roko would apply that value judgement, but there are definitely people who buy into the Great Filter* who are really seriously hoping that the answer to “what is the filter” turns out to be “there’s lots of civilizations out there, they’re just weirder than you’ve ever imagined” or “sapient, technical civilization sucks and most folks who try it out end up ditching the gig and going back to the trees” or the like. That’s a much better answer (in terms of our descendants’ chances for happiness) than “sapience turns out not to be much good for keeping the peace but very good for making bigger and nastier weapons,” or “sapients who invent radio make themselves big ol’ targets for ancient autonomous war machines, and even as we speak a Berserker is en route to chuck a Kuiper Belt object down our throats at 0.1c,” or whatever.

          *Not to imply any particular credulity on their part. I think it’s a useful analytical framework.

        • nydwracu
          May 1, 2016 @ 7:05 pm

          The Great Filter is trivially true — it’s a question, not a hypothesis. If the only reason we don’t observe ETIs everywhere is that life needs vanishingly rare conditions in order to evolve, that’s the Great Filter.

          That said, there’s certainly something interesting going on with the dissonance between one’s own values and Gnon’s. It wouldn’t be hard to relate Gnon-horror to a movie like Idiocracy, or to the founding trauma of nerddom: realizing that the world doesn’t share your values and doesn’t reward people for living up to them.

          Once you realize that, what do you do? There’s something interesting going on there, with neoreaction — the usual answer is to try to build a community that does share your values and reward people for living up to them, but there’s no reason not to abandon having distinct values altogether and try to submit entirely to instrumental rationality.

          …and now I’m reminded of Pasolini in ’68, and of course the New Left today is the true nerddom…

  2. Kit Power
    April 26, 2016 @ 10:26 am

    I am SO hungry for this book now. Take my damn money already.

  3. Jake
    April 26, 2016 @ 11:03 am

    Can’t wait to contribute to this on Kickstarter.

  4. Asteele
    April 28, 2016 @ 6:57 am

    Until this moment, I had managed to forget about the goofball Scott Alexander for five years.

  5. Paul M. Cray
    April 28, 2016 @ 9:49 am

    Yudkowsky, in his precocious phase, is extensively discussed in the 1997 edition of Damien Broderick’s “The Spike,” and he was prominent on the AGI email list in the early 2000s, so he was pretty well-known in Singularitarian circles a decade before “Overcoming Bias,” although he obviously found a larger and more appreciative audience there.
