Self-driving tech is widely distrusted by the public, and Tesla’s huge Autopilot recall and Cruise’s scandals don’t seem to have helped.

  • pearsaltchocolatebar@discuss.online

    That’s because Teslas aren’t autonomous vehicles. They’re just calling lane assist features self-driving.

    A true self-driving vehicle wouldn’t need a human watching it.

    • Spiralvortexisalie@lemmy.world

      Even the more advanced cars, the so-called Level 3 cars that have come out in the last year (Tesla, I believe, is still at the lower-rated Level 2), do not do as much as people think. Even the Level 3 cars I have tried suffer from a serious drawback: they will essentially ditch self-driving with little warning in populated areas. In the burbs and country, with straight highways, the vehicles can do all right, and/or well enough to convince people they are FSD. In city areas I have seen vehicles lose GPS lock or fail to read the lane markings and just kick out of self-driving mode, which can happen semi-often around large trucks and heavy bridges. That’s a life-or-death situation if you weren’t paying attention, or were stupid enough to sit in the back for a TikTok.

  • n3m37h@lemmy.world

    Teslas make driving on the roads dangerous. I was driving on a two-lane highway, passed a vehicle that was going a bit slow, and was coming up on an off-ramp. I decided to merge back behind a car going approximately the same speed as me; as I was starting to merge back, the bloody car saw a speed limit sign and started braking. I nearly rear-ended the fucking Tesla. Got a video too.

    The first day I got my dash cam, it was winter and the roads had just been plowed, so there was a strip of salt/sand approximately a car width wide in the middle of the road. On my way to work I was coming down a hill on my side of the road, and coming up the hill was a Tesla SUV driving with all 4 wheels on the sand (aka on the yellow line).

    Both cars and drivers are brain dead

      • n3m37h@lemmy.world

        That’s not it m8. The average speed on the 401 is 120 kph on slow days, 130 typically, and the speed limit is 100. Suddenly having a vehicle drop from 130 to 100 because it saw a sign is fucking dangerous.

        The Tesla was doing about 125 or so when I started merging, and I was 3 car lengths behind them; then it randomly braked when there was nothing in front of em.

        There is nothing safe about abrupt actions, which those POS do all the time.
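For a rough sense of why an unprompted slowdown like this is dangerous, here is a back-of-envelope sketch. The 130 and 100 km/h figures and the 3-car-length gap come from the comment; the ~4.5 m car length is an assumption.

```python
# Time to close the gap if the trailing car holds 130 km/h while
# the car ahead suddenly brakes to 100 km/h (figures from the comment).
closing_speed_ms = (130 - 100) / 3.6    # 30 km/h closing speed ≈ 8.3 m/s
gap_m = 3 * 4.5                         # 3 car lengths at ~4.5 m each (assumed)
time_to_close_s = gap_m / closing_speed_ms

print(f"gap closes in about {time_to_close_s:.1f} s")  # ~1.6 s
```

With typical driver reaction times around 1 to 1.5 seconds, roughly 1.6 seconds to notice the braking and respond leaves almost no margin.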

        • pearsaltchocolatebar@discuss.online

          If you’re close enough to the car in front of you that them slowing down by 25kph causes you to almost hit them, you’re too close for the speed you’re traveling.

          It doesn’t matter if there’s a clear reason for it or not, or if everyone else is breaking the law. You’re responsible for maintaining a safe distance, not the person you’re behind.

          3 car lengths is not a sufficient distance for 125kph.
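The arithmetic backs this up. A quick sketch using the common two-second rule of thumb; the ~4.5 m car length is an assumption.

```python
speed_ms = 125 / 3.6              # 125 km/h ≈ 34.7 m/s
gap_m = 3 * 4.5                   # 3 car lengths at ~4.5 m each (assumed)
headway_s = gap_m / speed_ms      # time gap to the car ahead
two_second_rule_m = 2 * speed_ms  # distance the two-second rule implies

print(f"{headway_s:.2f} s headway vs 2 s recommended")
print(f"{gap_m:.1f} m gap vs {two_second_rule_m:.0f} m recommended")
```

Three car lengths at 125 km/h works out to roughly 0.4 seconds of headway, about a fifth of the commonly recommended two seconds (~69 m).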

          • n3m37h@lemmy.world

            Easy Karen, I don’t need a driving lesson. I did not get into a fucking accident, because I was paying attention to the road and the traffic around me. Something the driver(s) of the Tesla were not doing, because who the fuck slows down suddenly on a highway with no one in front of them?

            Seriously take the hard sandpapery object out of your ass and calm down

        • AreaKode@lemmy.world

          And I would argue that you would see 10 humans do this same action in the time you’d see a robot do it once. Humans are horribly inefficient drivers, and eventually computers will be far superior. I hope to see much better advances in the coming years.

          They just need to… not call it “auto pilot” until it actually functions like an autopilot.

          • n3m37h@lemmy.world

            Hahahahahahaha

            Tesla won’t, because cameras alone won’t work. And the companies with access to lidar and radar, as well as mics, aren’t anywhere close to universally safe autonomous driving (outside of a predefined/mapped location).

            And machine learning is in its infancy right now; it is unpredictable and unreliable in a lot of applications. And don’t even get me started on this stuff operating in any conditions other than bright sunny days.

            I’ve used lane assist (Elantra, Tucson) and I find it to be terrible in many cases, long turns being the worst.

            Autonomous vehicles in warehouses have just as many issues, and they deal with nowhere near the number of variables that a car needs to account for…

            Please take your head outta your arse and come back to reality please

  • Fizz@lemmy.nz

    Self driving tech is pretty good and getting better at an insane rate. I think people only distrust it because of bad media reporting.

    • atzanteol@sh.itjust.works

      I don’t trust it because Musk lies all the time. It may work fine, but you can’t tell lies like he does and expect people to believe you this time.

      • Fizz@lemmy.nz

        Self-driving tech isn’t only Tesla. There are many implementations, and they are pretty amazing, in my opinion.

        • atzanteol@sh.itjust.works

          Sure, but it’s impressive in the same way that a dancing bear is impressive - and it’s not because the bear dances well.

          Even the best self-driving implementations are limited to warm sunny days in well-mapped areas.

          • dvoraqs@lemmy.world

            Actually, it can work pretty well. My Comma 3X could see and navigate the road better than I could in heavy rain on the highway. There are many different levels of maturity here, but even lane keep assist makes driving easier and is useful for that.

            You’re still right to distrust these systems, but that doesn’t mean that they are bad.

            • atzanteol@sh.itjust.works

              Oh yeah, it can work great. And it can work terribly. We haven’t hit the point where it’s reliably “great”, though. And that makes it rather more dangerous to me, since it builds a sense of security that is unwarranted (not that I’m saying you disagree; I’m just expanding on my distrust).

              One of the major problems is that the failure modes can be very different from how a person fails. Like when you see a car just sitting in the middle of a road because it can’t figure out what to do for some reason. A person you could wave on. An AI you can’t. We understand human behavior but can’t really understand the AI decision-making process.

              This is why I can’t quite get behind the “all AI needs to do is be slightly better than people” argument. On one hand, from a purely statistical POV, I get it. But if self-driving cars were “basically perfect” except that every now and then one of them randomly exploded (still killing fewer people than auto accidents do), would people be okay with that? Automobile accidents aren’t truly “random” like that.