If someone concludes that it’s fine for them to eat cows because cows never invented computers, then there’s not really much of an argument against aliens turning us into meat slurry because they have the 4-chambered quantum brains needed to invent hyperdrive and we don’t, is there?

  • BeamBrain [he/him]@hexbear.netOPM
    3 months ago

Thanks for reminding me that I still have your trilogy sitting in my Amazon library unread angery (in my defense, it’s largely because Graeber had to come first)

    “a perfect copy of someone is literally the same person, not just a new person that is pre-loaded with someone else’s memories”

    I’m inclined to say no, but I’ll admit that’s almost entirely because of Roko’s Basilisk

    • UlyssesT [he/him]@hexbear.net
      3 months ago

      I’m inclined to say no, but I’ll admit that’s almost entirely because of Roko’s Basilisk

      It may cheer you up, or maybe even make you laugh a little, to consider that the billionaire-worshipping cultists who conjured up that pile of brainworms didn’t think it was scary enough for the bazinga god to torture a perfect copy of you (which is totally, literally you, plucked from the past and back from the dead, so they believe), so they multiplied the scary by saying it’s millions or billions of perfect copies of you being tortured by an especially petty god of creepy nerds. But if the “perfect” copy is you, as they believe, what do millions or billions of that perfect copy add? Is there some space magic pain multiplier? Is the experience telepathically multitasked in some kind of massive ass-pull?

      The simpler answer that cuts that bazinga-Gordian knot is “the bazinga god conjured up a new person, or billions of new persons, all with copied memories, and is being a cruel asshole to them because a long-gone person made it upset by not worshipping it fervently enough.” Sure, it’s a fine premise for a petty and cruel machine god (that “LessWrong” cultists think is an ideal (final) solution to all of their problems) in a farcically over the top grimdark edgelord way, but the premise is so silly that it deserves only mockery.

      • BeamBrain [he/him]@hexbear.netOPM
        3 months ago

        It’s especially funny to me because it betrays a lack of understanding of how computers (and physics) even work. Thinking you can build a perfect copy of a person from previous records of their existence is like thinking you can, with absolute certainty, reconstruct a 1000x1000 image from a 100x100 resized copy of it. Sure, you can make an educated guess at those missing 990,000 pixels, even a pretty good extrapolation, but since there are vastly more ways to fill a 1000x1000 grid than a 100x100 one, by the pigeonhole principle each smaller image must map to many possible larger ones - and a .png is a hell of a lot less complex than a human being. Entropy says no.
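        The pigeonhole argument above can be sketched in a few lines. This is a hypothetical toy (tiny 2x2 "images" and a made-up `downscale_2x` helper, not any real resizing library): two different originals average-pool down to the same smaller image, so no algorithm could recover the original from the small copy with certainty.

        ```python
        # Toy illustration: downscaling (here, 2x2 average pooling) is
        # many-to-one, so a downscaled image cannot uniquely determine
        # the original it came from.

        def downscale_2x(img):
            """Average-pool a 2Nx2N grid of pixel values down to NxN."""
            n = len(img) // 2
            return [
                [
                    (img[2 * r][2 * c] + img[2 * r][2 * c + 1]
                     + img[2 * r + 1][2 * c] + img[2 * r + 1][2 * c + 1]) // 4
                    for c in range(n)
                ]
                for r in range(n)
            ]

        # Two clearly different "originals"...
        a = [[0, 200],
             [200, 0]]
        b = [[200, 0],
             [0, 200]]

        # ...collapse to the identical downscaled image.
        assert a != b
        assert downscale_2x(a) == downscale_2x(b) == [[100]]
        ```

        The same counting logic scales up: every pixel dropped multiplies the number of possible originals, which is the point being made about reconstructing a person from partial records.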

        • UlyssesT [he/him]@hexbear.net
          3 months ago

          I don’t even go into the “it’s fake” argument, which tends to lead to very dehumanizing places, like the idea that people who have prosthetic limbs today are somehow “less human.”

          That said, I also argue that the “that perfect copy is literally the original, and if you disagree you’re a mystic hippie who believes in souls and fairy dust” argument also dehumanizes people, by suggesting that your existence, right now, is so disposably interchangeable that a widget made at any point in the future (or a billion widgets) could not only replace you but also effectively render your original self invalid in favor of the new “originals.” I feel like it’s primarily a death-cheating cope fantasy from people who are (understandably, I admit) afraid of death and want to believe that pressing a button will bring their subjectively experienced current selves back from the dead.

          I mean that’s exactly what happened with Raymond Kurzweil. His dad died, he really didn’t handle grieving that well, and wouldn’t settle for anything less than Singularity™ computer magic undoing that loss. It’s a grand heaven promise, a religious one, but with the trappings of secular scientism all over it.

    • UlyssesT [he/him]@hexbear.net
      3 months ago

      Oh, I hope you didn’t click the spoiler up there then, if you plan to read those books later. It’s quite a major plot point.