Tesla recalls nearly all vehicles sold in US to fix system that monitors drivers using Autopilot

Tesla is recalling nearly all vehicles sold in the U.S., more than 2 million, to update software and fix a defective system when using Autopilot.

  • DreadPotato@sopuli.xyz · 11 months ago

    using the technology along a rural stretch of Florida highway where the software isn’t supposed to be deployed.

    Idiot drivers do idiot things; that's hardly unique to Tesla drivers. The whole Autopilot feature set is clearly marked as beta on the screen, with a warning you have to acknowledge before enabling it: that it is considered a beta feature and that you should stay attentive while using it.

    I agree that the FSD feature set is advertised with capabilities it in no way possesses. But everyone seems to forget that Autopilot and FSD are two separate things. Autopilot is only TACC+Lane-Assist and isn't advertised as a fully self-driving technology.

    • givesomefucks@lemmy.world · 11 months ago

      But everyone seems to forget that autopilot and FSD are two separate things.

      Because Tesla made it as confusing as possible on purpose and misleads consumers…

      Which is why they’re getting sued about it as I type this…

      • DreadPotato@sopuli.xyz · 11 months ago

        I'm honestly not sure why people have trouble differentiating FSD and Autopilot; they have separate names that don't even sound similar.

        I 100% agree that FSD has been falsely advertised for almost a decade now though.

        • CmdrShepard@lemmy.one · 11 months ago

          It's because they think they're experts after superficially reading some news headlines, and don't actually do any research or use either of these systems to gain the appropriate knowledge. They're just playing a game of telephone with what they think they know.

    • Gork@lemm.ee · 11 months ago

      The system should nevertheless be designed to handle all types of road and weather conditions. It's a safety-related system. To not do so, regardless of the reason (probably cost savings), is negligence on the part of Tesla.

      • CmdrShepard@lemmy.one · 11 months ago

        Autopilot isn’t any more dangerous than any other vehicle sold with cruise control over the past 30 years. I don’t understand why people are so desperate to give reckless drivers a pass rather than making them face consequences for their actions. Is it Honda’s fault if I hold my foot on the gas and drive a 1995 Civic through a red light and T-bone someone?

      • DreadPotato@sopuli.xyz · 11 months ago

        It handles the same roads and conditions as pretty much all other manufacturers' TACC+Lane-Assist solutions: it maintains distance to the vehicle ahead and keeps the car centered in the lane on straight roads and gentle curves…nothing more.

        The issue is people using this simple ADAS way outside its constraints, wanting it to be more than it is or is advertised as. Autopilot is not a self-driving solution, and isn't advertised as one. It has the same limitations as other ADAS solutions in other cars, but apparently because Tesla calls their solution "Autopilot", people completely disregard the system's warnings and limitations and insist on using it as a self-driving solution.

        • Fermion@mander.xyz · 11 months ago

          Sounds like Tesla should market it as adaptive cruise and lane assist, since clearly their clientele think autopilot means autonomous driving.

          We’re getting to the point where these features might need to be locked behind a specific driving license designation. Drivers need to demonstrate to a proctor that they understand where the systems do and do not work.

          We already have license classifications for commercial equipment and motorcycles, having one for automated features seems fairly justifiable at this point.

    • AdamEatsAss@lemmy.world · 11 months ago

      You would think a road with few drivers would be easier for the autopilot? But maybe the road lacked lines and markings? But wouldn't you want the car to default to human control rather than just keep going? Any car I've had with lane-keep assist turns off if it can't find the lines. It's a pretty simple failsafe. Better to have the driver a little annoyed than injured.

        • CmdrShepard@lemmy.one · 11 months ago

          And the driver in the above case was holding his foot on the accelerator to override AP and stated “I expect to be the driver and be responsible for this… I was highly aware that was still my responsibility to operate the vehicle safely."
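The override behavior at play here — holding the accelerator keeps the car moving even when cruise control would brake — is standard control arbitration: the driver's pedal input takes priority over the assist's command. A toy sketch of that priority rule (hypothetical names and units, not any manufacturer's actual code):

```python
def commanded_acceleration(assist_cmd: float, driver_pedal: float) -> float:
    """Arbitrate between the assist's command and the driver's pedal.

    assist_cmd: acceleration requested by cruise control (m/s^2, may be
                negative when it wants to brake)
    driver_pedal: acceleration implied by the driver's pedal position (>= 0)
    """
    if driver_pedal > 0:
        # A pressed pedal overrides automatic braking: the final command
        # is never less than what the driver is requesting.
        return max(assist_cmd, driver_pedal)
    return assist_cmd
```

With the pedal held down, the driver's request wins even if the assist wants to slow the car — which is why the system couldn't stop him.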

        • AdamEatsAss@lemmy.world · 11 months ago

          Exactly. Not sure if there's any regulation for this, but from a controls standpoint you always want to make sure you fail to a "safe" state. If the system can't find the inputs it needs, the outcome will be unpredictable.
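The fail-to-safe idea above is typically modeled as a small state machine: when a required input (here, lane lines) disappears, the assist hands control back to the driver rather than guessing, and never silently re-engages. A minimal sketch, with invented state names:

```python
from enum import Enum

class AssistState(Enum):
    ACTIVE = "active"      # lane centering engaged
    HANDOVER = "handover"  # alerting the driver, ramping out of control
    OFF = "off"            # driver has full control

def next_state(state: AssistState, lines_found: bool,
               driver_acknowledged: bool) -> AssistState:
    """Fail toward the safe state: missing input triggers a handover."""
    if state is AssistState.ACTIVE and not lines_found:
        # Required input lost: stop steering, demand driver takeover
        return AssistState.HANDOVER
    if state is AssistState.HANDOVER:
        # Never silently resume; stay in handover until the driver takes over
        return AssistState.OFF if driver_acknowledged else AssistState.HANDOVER
    return state
```

The key property is that every missing-input path leads to the driver, never to the system improvising.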