Tesla Vision fails as owners complain of Model 3 cameras fogging up in cold weather::A number of Tesla owners have taken to Reddit after their front cameras fogged up and stopped working in cold weather, leaving several features, including the US$10,000 FSD Beta, inoperable. Tesla has declined to assist these customers, despite many of their vehicles being covered under warranty.

  • NataliePortland@lemmy.ca
    ↑ 91 · ↓ 4 · 8 months ago

    Every other car uses LIDAR and Elon thinks he’s such a forward thinker for shunning it. So dumb

    • eskimofry@lemmy.world
      ↑ 24 · ↓ 1 · 8 months ago

      Think of it this way: He is such a visionary that fog is blinding his vehicles.

    • StenSaksTapir@feddit.dk
      ↑ 23 · ↓ 2 · edited · 8 months ago

      Radar. Only a small handful of cars have LIDAR. But your point still stands. Outside of Elon being a humongous douche and completely unpredictable, the lack of sensors is the major reason for not wanting a Tesla.

    • floofloof@lemmy.ca
      ↑ 13 · ↓ 10 · edited · 8 months ago

      The driving assist features of my Honda CR-V also stop working whenever there’s snow or ice on the front of the car. Bad design for cold climates is not just a Tesla issue.

      • Nudding@lemmy.world
        ↑ 34 · ↓ 7 · 8 months ago

        You shouldn’t be out of park while there’s snow or ice on your vehicle. Clean off your fucking vehicles.

        • floofloof@lemmy.ca
          ↑ 17 · ↓ 1 · 8 months ago

          Of course, and I do. I’m talking about the snow or ice that accumulates around the front bumper and grille while driving down the highway. At some point the car will notify me that driver assistance features are no longer available due to the sensor being obstructed.

        • CmdrShepard@lemmy.one
          ↑ 14 · ↓ 1 · 8 months ago

          I mean the stuff builds up on the front of your car while driving. Moisture sticks to the front and then refreezes.

        • Aleric@lemmy.world
          ↑ 11 · ↓ 3 · edited · 8 months ago

          As yet another guy who is sick and tired of clumps of snow flying off of people’s vehicles onto my windshield, thank you for your service.

    • /home/pineapplelover@lemm.ee
      ↑ 10 · ↓ 50 · 8 months ago

      His argument makes sense. Human vision is not too different from just a camera. I see the argument for lidar but it can also be a bit more expensive to accomplish the same task. I’m open to listening to your argument as to why lidar technology would be a better path for self driving cars.

      • Flying Squid@lemmy.world
        ↑ 21 · 8 months ago

        It seems to me that we want to make self-driving cars safer than human drivers. And to make them safer, you want them to use every kind of sensor that is practical to avoid accidents. LIDAR alone isn’t the path. Neither is vision alone.

        Also, suggesting that a car with cameras is equivalent to a human with a human brain that has eyes attached to it is a little silly.

      • nephs@lemmy.world
        ↑ 22 · ↓ 2 · 8 months ago

        The eye is the fucking whole argument for the stupid creationism. The most complex piece of machinery in the human body and shit.

        That man thinks he’s god, to create similar functionality.

        Has he fucking tried to keep his eyes open in fucking cold weather?

        Why not just use humans eyes outside of earth’s atmosphere?!

        He’s just so fucking stupid. Rich and stupid. The shit he spends his “hard earned” money on would be spent so much better and more efficiently by almost anyone else.

      • accideath@lemmy.world
        ↑ 21 · ↓ 2 · 8 months ago

        Humans don’t just use their eyes when driving. Sound and touch also play big roles, for example when it comes to hearing approaching ambulances or feeling road conditions. And we have a really good sense of depth and distance that’s much harder to replicate with just cameras. And even humans aren’t allowed to drive with headphones on (at least here in Germany), because it’s dangerous to limit the number of senses available to us.

        Besides that, even our sight is faaaar from perfect, and there are quite a lot of accidents caused by drivers simply not seeing another driver or some other obstacle. Our vision is pretty good, yes, but the amount of guessing our brain has to do for us to actually see what we do isn’t exactly small.

        I don’t know about you, but I would prefer a self driving vehicle to be safer than a human. Because if it isn’t, why bother? And how could it be safer, if it uses less information than humans, who are shit drivers already?

        And yes, lidar is more expensive, but so what? It’s cheap enough to add to phones. Expensive phones, yes, but in the grand scheme of things they’re still quite a bit cheaper than a car, and Teslas aren’t exactly cheap cars either. And Tesla used to include radar in their cars, until they didn’t. And the cars didn’t get that much cheaper…

        And to give a positive example: Mercedes-Benz is the first to launch a Level 3 autonomous vehicle. And guess what? It uses lidar, audio sensors, road condition sensors, etc. and actually achieved L3 autonomy, while Tesla’s FSD is consistently tested to be one of the worst performing Level 2 systems in the industry, despite their claims of greatness…

        • Socsa@sh.itjust.works
          ↑ 4 · ↓ 1 · 8 months ago

          Lidar is not just more expensive, it is extremely fragile in a vehicle that is bouncing around at highway speeds.

          • accideath@lemmy.world
            ↑ 2 · ↓ 3 · 8 months ago

            Well, doesn’t seem to bother any other car manufacturer much. Probably because the benefits outweigh the complexity disadvantages

              • accideath@lemmy.world
                ↑ 2 · ↓ 2 · 8 months ago

                Mercedes-Benz is. And others will certainly follow since Mercedes-Benz are the first to reach L3 autonomy

      • aesthelete@lemmy.world
        ↑ 17 · ↓ 3 · 8 months ago

        Human vision is not too different from just a camera.

        Oh yeah, human vision also causes people to mistake a blue truck for the sky and drive right into it. /s

          • aesthelete@lemmy.world
            ↑ 1 · ↓ 4 · edited · 8 months ago

            Sure but usually because they weren’t looking or couldn’t see it…not because they mistook a truck for the sky or some of the other dumb shit computer vision algorithms do.

              • aesthelete@lemmy.world
                ↑ 3 · ↓ 2 · edited · 8 months ago

                Not seeing something and mistaking something for another thing are pretty different problems. One can be corrected with glasses while correcting the other requires a brain transplant (or a brain in the first place).

                Edit: or, ya know adding another sensor would work and make it so the vision system wouldn’t have to be so good at object recognition and could just not hit things…but we can’t add the couple hundred dollars worth of parts for that.

      • loutr@sh.itjust.works
        ↑ 16 · ↓ 2 · 8 months ago

        The obvious argument is that eyes are far from perfect and fail us all the time, especially when going fast. We are quite good at making up for it, but saying “We have eyes so my self driving cars will have eyes too” is pretty fucking dumb.

        • ItsMeSpez@lemmy.world
          ↑ 7 · 8 months ago

          We also recognized that we need to keep our windshields clear of fog in order for our eyes to work properly.

      • GoodEye8@lemm.ee
        ↑ 13 · 8 months ago

        That argument doesn’t make sense because human vision isn’t that great either. When it’s dark or raining or snowing or foggy our vision is pretty shit.

        I’m not saying LIDAR is better, but rather pointing out that you actually want different types of sensors to accurately assess the traffic, because just one type of sensor isn’t likely to cut it. If you look at other manufacturers, they’re not using only LIDAR or only cameras. Some use LIDAR + camera, some use RADAR + camera, some use LIDAR, RADAR and camera. And I’m pretty sure that as manufacturers aim for higher SAE levels they will add even more sensors to their cars. It’s only Tesla who thinks they can somehow do more with less.
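        A toy way to see why redundant sensor types help, using made-up detection probabilities (pure illustration, not any manufacturer’s real numbers or algorithm):

```python
# Hypothetical illustration: probability that at least one sensor still
# detects a real obstacle, assuming each sensor fails independently.
def fused_detection_prob(sensor_probs):
    """sensor_probs: per-sensor detection probabilities in [0, 1]."""
    miss_prob = 1.0
    for p in sensor_probs:
        miss_prob *= (1.0 - p)  # every sensor must miss at the same time
    return 1.0 - miss_prob

# A fogged-up camera alone vs. the same camera plus a fog-tolerant radar:
camera_only = fused_detection_prob([0.60])              # 0.60
camera_plus_radar = fused_detection_prob([0.60, 0.95])  # ≈ 0.98
```

        The numbers are invented, but the shape of the argument is real: modalities that fail under different conditions multiply their miss probabilities together, which is exactly what a camera-only stack gives up.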

        • /home/pineapplelover@lemm.ee
          ↑ 1 · 8 months ago

          I think it’s undeniable that the combination of camera and lidar will be the best solution. I just hope it can be cost effective. Maybe over time we can adapt and improve the technology and make it more economical, so that it is safer for our roads.

        • Socsa@sh.itjust.works
          ↑ 2 · ↓ 2 · 8 months ago

          People here have no idea what they are talking about, or how absurdly difficult it is to actually deploy lidar to a consumer vehicle. There’s a reason why Tesla is shipping more advanced driver assist tech than anyone else, and it’s because they went against the venture capitalist Lidar obsession which is holding everyone back. There’s a reason why there are basically zero cars shipping with lidar today.

          You don’t need mm depth maps to do self driving. Not that you get that from lidar on rough roads anyway.

          • /home/pineapplelover@lemm.ee
            ↑ 2 · 8 months ago

            There are some test cars with lidar. It has the spinny thing on top and looks pretty interesting. I believe those cars are pretty successful. I don’t think they’re being mass produced though, because the costs might be a little prohibitive.

          • learningduck@programming.dev
            ↑ 2 · ↓ 1 · 8 months ago

            The most advanced, and yet it’s not even at autonomy level 3. It’s funny that Mercedes is the first to get level 3 approval in California, and they aren’t even boasting about it that much.

            That aside, a secondary sensor that helps verify whether the vision system got it right would be nice. It could be just a radar or whatever. Imagine if the vision system fails to recognize a boy in a Halloween costume as a person; at least the secondary sensor would force the car to stop due to the contradicting perception.

            • GoodEye8@lemm.ee
              ↑ 1 · 8 months ago

              I might be misremembering but I think Teslas are actually more capable, they’re just deliberately stating they’re SAE level 2 so they could skirt the law and play loose and dangerous with their public beta test.

              • learningduck@programming.dev
                ↑ 1 · 8 months ago

                I haven’t researched this enough, but Tesla saying they are level 3 while never bothering to get the actual approval is like how I kept saying I was smart, but too lazy, back in my school years.

                Put your money where your mouth is. Lives are at stake here.

      • learningduck@programming.dev
        ↑ 10 · edited · 8 months ago

        Think of the Coyote and Road Runner cartoons. If there’s graffiti that looks like a tunnel, the coyote may drive right into it based on vision alone, but a secondary sensor will help tell that it’s a wall.

        In real life, if the vision system fails to recognize that there’s something on the road, at least a secondary sensor will protest that there’s something there.
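        The painted-tunnel example boils down to a veto rule. A minimal sketch, with hypothetical sensor inputs rather than any real driving stack:

```python
# Hypothetical sketch of a conservative "any sensor can veto" rule:
# if vision and a secondary sensor disagree about the path, stop.
def path_is_clear(vision_clear: bool, secondary_clear: bool) -> bool:
    # Proceed only when both modalities agree the road is clear;
    # a single sensor reporting an obstacle is enough to stop.
    return vision_clear and secondary_clear

# The painted-tunnel case: vision is fooled, radar sees the wall.
assert path_is_clear(vision_clear=True, secondary_clear=False) is False
```

        The point of the redundancy is that the vision system no longer has to be right every time; it only has to avoid agreeing with a second, independent sensor on the same mistake.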

        • HERRAX@sopuli.xyz
          ↑ 7 · ↓ 1 · edited · 8 months ago

          You can also try driving into direct sunlight without sunglasses or the sun cover. You get notifications and beeping noises whenever the sun hits the cameras directly, making the lane assist (I refuse to call it autopilot) quite unreliable in most weather… It’s actually worse for me than driving in cold weather.

      • poopkins@lemmy.world
        ↑ 1 · ↓ 9 · 8 months ago

        While I disagree with you that you think his argument makes sense, I’m upvoting your comment because it encourages discourse and provides more insight and depth to this topic. I wish more people on Lemmy did the same.