Google’s latest flagship smartphone raises concerns about user privacy and security. It frequently transmits private user data to the tech giant before any app is installed. Moreover, the Cybernews research team has discovered that it potentially has remote management capabilities without user awareness or approval.

Cybernews researchers analyzed the new Pixel 9 Pro XL smartphone’s web traffic, focusing on what a new smartphone sends to Google.

“Every 15 minutes, Google Pixel 9 Pro XL sends a data packet to Google. The device shares location, email address, phone number, network status, and other telemetry. Even more concerning, the phone periodically attempts to download and run new code, potentially opening up security risks,” said Aras Nazarovas, a security researcher at Cybernews…

… “The amount of data transmitted and the potential for remote management casts doubt on who truly owns the device. Users may have paid for it, but the deep integration of surveillance systems in the ecosystem may leave users vulnerable to privacy violations,” Nazarovas said…
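
For anyone who wants to sanity-check this kind of claim on their own hardware, the observation boils down to watching when, and to which endpoints, a phone checks in. The sketch below is not Cybernews’ methodology; it is a minimal Python example, assuming scapy is installed and the script runs as root on a machine that can see the phone’s traffic (for example the Wi-Fi gateway or a mirrored switch port). PHONE_IP and the watched-domain list are placeholders. It only logs the timing of DNS lookups to Google-operated domains, which is enough to spot a periodic check-in pattern; inspecting the payload contents (location, email address, and so on) would additionally require TLS interception with a proxy certificate trusted by the device.

```python
# Minimal sketch: timestamp DNS lookups a phone makes to Google-operated
# domains, to eyeball periodic check-ins like the ~15-minute pattern above.
# Assumptions (not from the article): scapy is installed, the script runs as
# root on a box that can see the phone's traffic; PHONE_IP and WATCHED are
# placeholders, not values from the research.
from datetime import datetime
from scapy.all import sniff, IP, DNSQR

PHONE_IP = "192.168.1.42"  # hypothetical address of the phone under test
WATCHED = ("google.com", "googleapis.com", "gstatic.com", "1e100.net")

def log_query(pkt):
    """Print a timestamped line for each watched DNS query from the phone."""
    if not (pkt.haslayer(IP) and pkt.haslayer(DNSQR)):
        return
    if pkt[IP].src != PHONE_IP:
        return
    name = pkt[DNSQR].qname.decode(errors="replace").rstrip(".")
    if name.endswith(WATCHED):
        print(f"{datetime.now().isoformat(timespec='seconds')}  {name}")

# Capture only DNS traffic; store=False keeps memory use flat for long runs.
sniff(filter="udp port 53", prn=log_query, store=False)
```

Left running for a few hours, evenly spaced clusters of lookups are the kind of signal the quoted “every 15 minutes” claim refers to.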

  • DavidGarcia · 1 month ago

    It’s so ironic that Pixels are the go-to devices for privacy ROMs these days.

    All this shit is probably happening at the hardware level too, with 100 different backdoors you can’t remove with your megamind plan of installing a custom ROM.

    The silicon probably has the ability to live stream all sensor data directly to the NSA using the fanciest ML compression technology lmao.

    • ExtremeDullard@lemmy.sdf.org · 1 month ago

      It’s so ironic that Pixels are the go-to devices for privacy ROMs these days.

      It’s so ironic it’s a show-stopper for me. I’m not paying fucking Google to escape the Google dystopia. Nosiree! That’s just too rich for me.

      This is why I own a Fairphone running CalyxOS. Yes, I know GrapheneOS is supposedly more secure - I say supposedly because I think 95% of users don’t have a threat model that justifies the extra security really. But I don’t care: my number one priority is not giving Google a single cent. If it means running a less secure OS, I’m fine with that.

      There’s no way on God’s green Earth I’m buying a Pixel phone to run a deGoogled OS. That’s such an insane proposition I don’t even know how anybody can twist their brain into believing this is a rational thing to do.

        • ExtremeDullard@lemmy.sdf.org · 1 month ago

          I’ve argued this many times with many people, and everybody seems to interpret things in whatever way suits their preferences.

          Here’s my line of thinking:

          • If the first buyer buys a Google cellphone new for, say, $500 (no idea of the price, just making it up for the sake of explaining), this buyer gives $500 to Google
          • If I then buy this cellphone second-hand for, say, $300, the original buyer gets $300 back, meaning Google now has $300 of my money.

          That’s a hard no.

          Of course, there’s the argument that Google got $500 no matter what and they don’t know who the money is from. But that’s beside the point: I know Google got my money. I most definitely parted with $300 to acquire a Google cellphone, meaning that as far as I’m concerned, I indirectly gave Google $300 of my money. And I refuse to give Google any money, however indirect the transaction might be. The only way I could become the owner of a Google phone is if someone gave one to me, I found it in the trash, or I stole it.

          There’s also the argument that if I don’t buy the cellphone, it might end up in a landfill, so if I’m environmentally minded, I should save it from the landfill. That’s true, but my counter-argument is that a healthy second-hand market for Google phones gives them more value, makes them more appealing to potential buyers, and ultimately supports Google’s business.

          I don’t like serviceable stuff being landfilled for no good reason (otherwise I wouldn’t pay extra to buy a Fairphone) but in the case of Google hardware, I reckon it should end up at the landfill as often as possible to diminish its value and hurt Google. Of course, I’m only one meaningless guy, but I reckon boycotting Google is a moral duty for anybody who’s concerned about privacy and civil liberties.

          And of course, I don’t want a Google product in my pocket because it would make me nauseous. But that’s entirely subjective.

      • MajorHavoc@programming.dev · 1 month ago

        I say supposedly because I think 95% of users don’t have a threat model that justifies the extra security really.

        Does street cred with my Cybersecurity peers count as a threat model?

        I’m definitely one of the users of GrapheneOS that you’re talking about. My threat model is “this is fucking cool!”

        Also, the grass is always greener on the other side. I want a Fairphone.

    • smeg@feddit.uk · 1 month ago

      Citation needed. I get that it’s healthy not to trust anyone, but with the amount of security research that goes into these devices, if something like that were happening, we would know about it.

        • smeg@feddit.uk · 1 month ago

          1. Applies to every phone, smart or simple, can be combatted with a £5 Faraday bag
          2. That is about monitoring by your network, nothing to do with the phone manufacturer really
          3. A ten year old article about Samsung phones
          4. An exploit affecting lots of phones that seems like it was fixed

          So a few interesting points, but nothing even slightly like what OP was suggesting.

          • refalo@programming.dev · 1 month ago

            can be combatted with a £5 Faraday bag

            I don’t consider that a reasonable solution for most people, and there are many posts claiming those almost never work well enough. You could also make the argument that it shouldn’t be necessary in the first place.

            That is about monitoring by your network

            I don’t think it matters to most people, as you are still tracked by having the phone physically with you, which is what people are against.

            A ten year old article about Samsung phones

            Are you suggesting Samsung phones should have ever been allowed to spy on people? Or that this doesn’t highlight a bigger issue? I don’t see why this should get a pass at all.

            An exploit affecting lots of phones that seems like it was fixed

            I think it’s very much a real threat, and leaked docs show that world governments and bad actors have routinely used such exploits for years, including keeping previously unknown exploits secret for their own use.

            I understand your desire to turn talking points into nothingburgers, but I feel like this is not only disingenuous but also against the entire principle of security and privacy. Of course we all have our own individual threat models, but dismissing another person’s model because you think it shouldn’t matter to anyone doesn’t seem like a good idea to me.

            • smeg@feddit.uk · 1 month ago

              Look, I’m not trying to say there aren’t real security/privacy issues being exploited right now; my “citation needed” was about this comment:

              The silicon probably has the ability to live stream all sensor data directly to the NSA using the fanciest ML compression technology lmao.

              The articles you linked are real, documented issues; OP was arguing that Google phones specifically are bad based on a statement they pulled out of their arse.

    • mctoasterson@reddthat.com · 1 month ago

      Maybe, and maybe not. We need to encourage robust alternatives; unfortunately, that requires a ton of capital to develop hardware, reserve fab time, and get devices fabricated when you aren’t a major player like Google or Samsung.

      We basically need something in the smartphone space equivalent to the Framework laptop: a device that meets the hardware security requirements, allows bootloader unlock/relock, and supports GrapheneOS and other custom ROMs.
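
      As a rough illustration of the “bootloader unlock/relock” requirement, here is a small hypothetical Python sketch (not from this thread) that asks a connected phone, via adb, for the properties Pixel-class devices commonly expose about unlock support and lock state. It assumes adb is installed, USB debugging is enabled, and that the property names (ro.oem_unlock_supported, ro.boot.flash.locked, ro.boot.verifiedbootstate) behave as they do on Pixels; other vendors may differ or hide them.

      ```python
      # Hypothetical sketch: query bootloader-related properties over adb.
      # Assumes adb is on PATH and a device with USB debugging is attached.
      import subprocess

      def adb_getprop(prop: str) -> str:
          """Return an Android system property via adb, or '' on failure."""
          try:
              out = subprocess.run(
                  ["adb", "shell", "getprop", prop],
                  capture_output=True, text=True, timeout=10, check=True,
              )
              return out.stdout.strip()
          except (subprocess.SubprocessError, FileNotFoundError):
              return ""

      if __name__ == "__main__":
          # 1 means the vendor/carrier exposes the "OEM unlocking" toggle at all.
          print("oem_unlock_supported:", adb_getprop("ro.oem_unlock_supported"))
          # 1 means the bootloader is currently locked; 0 means unlocked.
          print("flash.locked:        ", adb_getprop("ro.boot.flash.locked"))
          # "green" = locked and verified boot; "orange" = unlocked.
          print("verifiedbootstate:   ", adb_getprop("ro.boot.verifiedbootstate"))
      ```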