If someone concludes that it’s fine for them to eat cows because cows never invented computers, then there’s not really much of an argument against aliens turning us into meat slurry because they have the 4-chambered quantum brains needed to invent hyperdrive and we don’t, is there?

  • UlyssesT [he/him]@hexbear.net · 3 months ago

    I HATE DARK FOREST THEORY

    I HATE DARK FOREST THEORY

    For real though, I agree with you: the kill-or-be-killed chuddery that infests most contemporary science fiction concepts of aliens is creatively poisonous to me.

    I think it’s also related to more recent fads of “the universe is empty of all sapient life except Humanity Fuck Yeah jackoffs that want resources” settings. Even the possibility of peacefully sharing the stars in science fiction with non-human entities has become repulsive, scary, or “woke” to typical contemporary sci-fi consumers.

      • UlyssesT [he/him]@hexbear.net · 3 months ago

        I wholeheartedly agree with your takes in the OP in that link.

        While I tend to struggle-session with @Frank@hexbear.net from time to time, especially over “a perfect copy of someone is literally the same person, not just a new person pre-loaded with someone else’s memories” presumptions (which more often than not include a convenient “and the copying technology must kill the original person, of course” rug-sweep, so there’s no messy “I’m still standing here, yo” counterpoint to that supposedly transferable originality), we agree on plenty.

        I actually do completely agree with Frank that “cyberware takes away your essence/humanity/whatever” is bullshit purity ideology. I went out of my way to emphasize in my own fiction that most working-class people already had cyberware (coercively, for productivity purposes, loaded with malware and trackers) and it didn’t make them “less human.”

        Also, in my own novel trilogy (spoilers for anyone who might read my books in the future and doesn’t want the main plot twist spoiled in advance):

        spoiler

        a completely artificial and synthetically created, but fully valid, sapient being, burdened with the memories of her human “donor” to the point of confusing herself into thinking that she was that person until she learned what had actually happened, eventually becomes the primary protagonist. She fully embraces her true origins and becomes a synthesis of those injected memories and her own experiences and volition.

        There’s sometimes an unnecessary divide where some people seem to think “not the original person” means “not a valid person in their own right,” and I think that’s bullshit. I’d even argue that in a society where fully artificial people are possible, people who are sapient without biological nervous systems, it’d be a sort of burden, even an act of cruelty, to forcibly inject memories and personalities into those persons against their will under whatever illusory pretense of “uploading” might be desired by biological beings.

        • Frank [he/him]@hexbear.net · 3 months ago

          One of the few genuinely impressive things I encountered in the CP77 game: you get a side mission to rescue some monks who were kidnapped to be forcibly implanted with 'ware against their will. You end up rescuing them.

          Later you can encounter them again. When you ask them if they think Johnny, or any other engram, is a real person, one of them says that if the engram is able to suffer then it is alive and has a soul.

          • UlyssesT [he/him]@hexbear.net · 3 months ago

            One of the few genuinely impressive things I encountered in the CP77 game: you get a side mission to rescue some monks who were kidnapped to be forcibly implanted with 'ware against their will. You end up rescuing them.

            Very critical support of CDPR’s Cyberpunkerinos for featuring that moment instead of the bullshit parade that Deus Ex’s sequels eventually became. Human Revolution sort of addressed coercive pressure on sex workers and the economic pressures of having cyberware and maintaining it in a human body, but that was basically dropped later for “what if the new racism… is against poor persecuted rich assholes with superpowers, except Adam Jensen, because it’d be a bummer to play through that?” centrist

            Later you can encounter them again. When you ask them if they think Johnny, or any other engram, is a real person, one of them says that if the engram is able to suffer then it is alive and has a soul.

            For all our disagreements in the past, I also agree: an artificial person of sufficient sapience is a person and deserves rights.

        • BeamBrain [he/him]@hexbear.net (OP) · 3 months ago

          Thanks for reminding me that I still have your trilogy sitting in my Amazon library unread angery (in my defense, it’s largely because Graeber had to come first)

          “a perfect copy of someone is literally the same person, not just a new person that is pre-loaded with someone else’s memories”

          I’m inclined to say no, but I’ll admit that’s almost entirely because of Roko’s Basilisk

          • UlyssesT [he/him]@hexbear.net · 3 months ago

            I’m inclined to say no, but I’ll admit that’s almost entirely because of Roko’s Basilisk

            It may cheer you up, or maybe even make you laugh a little, to consider that the billionaire-worshipping cultists who conjured up that pile of brainworms didn’t think it was scary enough for the bazinga god to torture one perfect copy of you (which, they believe, is literally you, plucked from the past and back from the dead). They had to multiply the scary by saying it’s millions or billions of perfect copies of you being tortured by an especially petty god of creepy nerds. So if the “perfect” copy is you, as they believe, what do millions or billions of that perfect copy add? Is there some space-magic pain multiplier? Is the experience telepathically multitasked in some kind of massive ass-pull?

            The simpler answer that cuts that bazinga-Gordian knot is “the bazinga god conjured up a new person, or billions of new persons, all with copied memories, and is being a cruel asshole to them because a long-gone person made it upset by not worshipping it fervently enough.” Sure, that’s a fine premise for a petty and cruel machine god (one that “LessWrong” cultists think is an ideal, final solution to all of their problems) in a farcically over-the-top grimdark edgelord way, but the premise is so silly that it deserves only mockery.

            • BeamBrain [he/him]@hexbear.net (OP) · 3 months ago

              It’s especially funny to me because it betrays a lack of understanding of how computers (and physics) even work. Thinking you can build a perfect copy of a person from previous records of their existence is like thinking you can, with absolute certainty, 100% accurately reconstruct a 1000x1000 image from a 100x100 resized copy of it. Sure, you can make an educated guess at what was in those missing 990,000 pixels, even a pretty good extrapolation, but since there are far more ways to fill a 1000x1000 grid than a 100x100 one, the smaller image must, by the pigeonhole principle, map to multiple possible larger ones - and a .png is a hell of a lot less complex than a human being. Entropy says no.
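
              A minimal sketch of that pigeonhole point (my own toy illustration in Python, with made-up helper names, not anything from the thread): even a tiny average-pooling “downscale” is many-to-one, so no decoder can pick out “the” original with certainty.

              ```python
              # Toy demonstration: downscaling collapses many distinct images
              # onto the same output, so perfect reconstruction is impossible.
              import itertools
              from collections import Counter

              def downscale(img):
                  """Average-pool a 2x2 binary image down to one pixel."""
                  return sum(img) / len(img)

              # Enumerate all 16 possible 2x2 binary images and count how many
              # collapse onto each downscaled value.
              collisions = Counter(
                  downscale(img) for img in itertools.product([0, 1], repeat=4)
              )
              print(collisions)
              # Counter({0.5: 6, 0.25: 4, 0.75: 4, 0.0: 1, 1.0: 1})
              # Sixteen originals map onto only five outputs: six distinct images
              # all downscale to 0.5, so the inverse is not even a function.
              ```

              The same counting argument only gets worse at 100x100 versus 1000x1000, and a human being carries vastly more state than any .png.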

              • UlyssesT [he/him]@hexbear.net · 3 months ago

                I don’t even go into the “it’s fake” argument, which tends to lead to very dehumanizing ideas, like the notion that people who have prosthetic limbs today are somehow “less human” and the like.

                That said, I also argue that “that perfect copy is literally the original, and if you disagree you are a mystic hippie that believes in souls and fairy dust” arguments also dehumanize people, by suggesting that your existence, right now, is so disposably interchangeable that a widget made at any point in the future (or a billion widgets) would not only replace you but also effectively render your original self invalid in favor of the new “originals.” I feel like it’s primarily death-cheating cope fantasy from people that are (understandably, I admit) afraid of death and want to believe that pressing a button will bring their subjectively experienced current selves back from the dead.

                I mean, that’s exactly what happened with Raymond Kurzweil. His dad died, he really didn’t handle the grief well, and he wouldn’t settle for anything less than Singularity™ computer magic undoing that loss. It’s a grand heaven promise, a religious one, but with the trappings of secular scientism all over it.

          • UlyssesT [he/him]@hexbear.net · 3 months ago

            Oh, I hope you didn’t click the spoiler up there then, if you plan to read those books later. It’s quite a major plot point.

        • TankieTanuki [he/him]@hexbear.net · 3 months ago

          some people seem to think “not the original person” means “not a valid person in their own right”

          Did you guys struggle sesh over Tuvix?

          • UlyssesT [he/him]@hexbear.net · 3 months ago

            Not directly, no. Warlord Janeway cut that particular Gordian knot by killing the new person anyway because she wanted Tuvok back.

  • NaevaTheRat@vegantheoryclub.org · 3 months ago

    There’s a fun computer puzzle game (Turing Complete) whose premise is that an alien is testing whether you can build a computer from simulated components; if you can’t, you’re OK to eat. (See the gate-building sketch after this comment.)

    I always felt like the person who made that was vegan lol.

    I think it’s all common-root stuff though: a might-makes-right framework for assessing the world.
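
    A rough sketch of what that premise looks like in practice (my own illustration in Python, not code from the game): the game seeds you with a primitive gate, and everything else, up to a full computer, is composition. Here NAND is the seed, since it is functionally complete.

    ```python
    # Toy illustration: NAND is functionally complete, so every other
    # logic gate (and eventually a whole computer) can be built from it.

    def nand(a: int, b: int) -> int:
        return int(not (a and b))

    def not_(a: int) -> int:
        return nand(a, a)

    def and_(a: int, b: int) -> int:
        return not_(nand(a, b))

    def or_(a: int, b: int) -> int:
        return nand(not_(a), not_(b))

    def xor(a: int, b: int) -> int:
        # Classic four-NAND XOR construction.
        n = nand(a, b)
        return nand(nand(a, n), nand(b, n))

    # Sanity check: XOR built purely from NAND behaves as expected.
    assert [xor(a, b) for a in (0, 1) for b in (0, 1)] == [0, 1, 1, 0]
    ```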

  • StalinStan [none/use name]@hexbear.net · 3 months ago

    The liberal assumption is that our world is the best possible one, so we have to assume that it is a war of all against all.

    However, without the civilizing effects of capitalism, we would return to our wild nature and have a world where everyone is struggling against everyone else all the time.

    So aliens, being more advanced, would have to be more like us, as we are perfect.

  • quarrk [he/him]@hexbear.net · 3 months ago

    That’s a very intriguing thought. Humans do have a tendency to confront their morality only through mediated forms, e.g. deities. It would make sense that the same applies to a subconscious guilt about carnism and colonialism.

    • UlyssesT [he/him]@hexbear.net · 3 months ago

      I’ve seen it many times, with carnism and numerous other treat-defense impulses: the treat defender gets angriest at the moment that they might, just might, have felt a pang of guilt for whatever harm came from the cultivation, harvesting, production, distribution, or consumption of that treat.

      Main Character Syndrome, I think. “I can’t do bad things. I don’t feel like a bad guy. It is the person scolding me that is wrong.” galaxy-brain