A judge has found “reasonable evidence” that Elon Musk and other executives at Tesla knew that the company’s self-driving technology was defective but still allowed the cars to be driven in an unsafe manner, according to a recent ruling issued in Florida.

Palm Beach county circuit court judge Reid Scott said he had found evidence that Tesla “engaged in a marketing strategy that painted the products as autonomous” and that Musk’s public statements about the technology “had a significant effect on the belief about the capabilities of the products”.

The ruling, reported by Reuters on Wednesday, clears the way for a lawsuit over a fatal crash in 2019 north of Miami involving a Tesla Model 3. The vehicle crashed into an 18-wheeler truck that had turned on to the road into the path of driver Stephen Banner, shearing off the Tesla’s roof and killing Banner.

  • bedrooms@kbin.social · 1 year ago

    This might be game over for the concept of autonomous cars.

    As always, advocates forgot about corporate greed. Do you trust your manufacturer not to lie to you? Enough to risk killing yourself, your family, and other people on the road?

      • Ottomateeverything@lemmy.world · 1 year ago

      Yeah, the scary part of this is that as much as I absolutely would never go near this shit with a ten-foot pole when it’s clearly still woefully inadequate and overhyped… they very frequently drive within ten feet of me, because for some reason it’s legal to put this shit on roads with unwilling participants.

        • RedditRefugee69@lemmy.world · 1 year ago

        I get it, there’s an inevitable conflict of interests here, but we can’t really tell other people not to do things we don’t like in a free country.

        Edit: this is clearly being misinterpreted. I am NOT talking about the Tesla. I’m saying a hypothetical, well-regulated self-driving car could be fielded without the permission of every other motorist who thinks they’re icky.

          • jopepa@lemmy.world · 1 year ago

          Yes, we can tell people they can’t do things. Welcome to society: we’ve all been talking and decided on a bunch of things people can’t do in a free country. These are public roads; it’s entirely reasonable to have restrictions on self-driving cars, just like you can’t ride a tandem bicycle in the HOV lane.

          • mateomaui@reddthat.com · 1 year ago

          People get fined for having unsafe vehicles on public roads all the time. All that’s needed here is a regulatory body to decide self-driving cars are unsafe enough to revoke approval.

            • RedditRefugee69@lemmy.world · 1 year ago

            Oh hell yeah, if it’s unsafe. I’m making the finer point that saying “you don’t have the right to drive that car next to me cuz it makes me feel weird” is overstepping.

              • mateomaui@reddthat.com · 1 year ago

              I’m pretty sure the actual concern has less to do with “feeling weird” and more with “because it and/or its inattentive driver may suddenly kill me”, because of a dysfunctional self-driving system whose capabilities have been fraudulently marketed and which has, in reality, repeatedly killed people.

          • Fedizen@lemmy.world · 1 year ago

          Truth-in-advertising laws exist for a reason.

          Also, the people who frequently talk about a “free country” are often the same ones who want more police so they can do Taliban-style gender policing, so the expression seems deeply inauthentic at this point.

            • RedditRefugee69@lemmy.world · 1 year ago

            True about “free country” being used to justify a society controlled by extreme wealth. And I’m talking about another person’s right to “drive” a self-driving car next to me, not about these guys objectively being criminally ass-hole-y.

            • RedditRefugee69@lemmy.world · 1 year ago

            I am referring to America, which prides itself on freedom (and not enough on equality and collectivism). I’m just saying it makes legal sense that you don’t need the consent of every other motorist to operate a self-driving car (if it passes safety regulations, and assuming no regulatory capture). Neither of those assumptions holds here.

      • Fedizen@lemmy.world · 1 year ago

      Autonomous vehicles ten years ago: Human drivers are slow and prone to lapses in judgement

      Autonomous vehicles today: Elon Musk, a guy who famously destroyed a rare vehicle like a dumbass, will be training the AI that drives you around. It won’t know how to respond to an event not encountered in the training data, and it will occasionally run into an ambulance.

        • jopepa@lemmy.world · 1 year ago

        And his employees hate him so much I wouldn’t be surprised if there’s a patch released that makes one sustained fart noise when airbags deploy.

      • mateomaui@reddthat.com · 1 year ago

      Maybe, at least until there’s a better comprehensive infrastructure of external sensors on roads, at intersections, etc., to control and limit vehicle movement. But it will probably be a long while before we get those improvements, considering that normal routine road and bridge maintenance is already far behind as it is.

      • RushingSquirrel@lemm.ee · 1 year ago

      To me, autonomous vehicles are like AI (in Tesla’s case it actually is AI): the public perception is that it’s way better than it really is, because it’s really good in 80% of cases. But getting to 90-95% will still take many, many years. That doesn’t mean we should stop using them, nor abandon them. To progress, we have to keep using them with caution: learn the limits and work within them. Don’t start firing people to replace them with AI, because in a few months or years you’ll realize that the remaining 20% hurts more than you thought. In the same way, you shouldn’t remove drivers just yet.

        • IphtashuFitz@lemmy.world · 1 year ago

        But it’s not true AI. In my decades of experience driving cars I’ve encountered numerous edge cases that I never explicitly learned about in driver’s ed. One recent case in point: I pulled up to a red light at a fairly busy intersection and stopped. While the light was still red, a police officer on the corner at a construction site walked out and tried to wave me through the intersection. I was watching the red light, so I didn’t even see him until he yelled at me.

        How would an autonomous AI car handle that situation if it’s not explicitly trained to recognize it? It would need to recognize the police officer as an authority that legitimately overrides the red light.

        At the same intersection a few years earlier, I saw a car engulfed in flames right in the middle of it. I saw and heard the fire trucks rapidly approaching as I got there. I, and others, realized we needed to get out of the way quickly. Would a Tesla AI (or any other) recognize that the car is on fire and safely move away, or would it just recognize the shape of a car and patiently wait for it to move out of the intersection before proceeding?

        The point is that it’s virtually impossible to predict for, and program an AI to handle, every single situation it might ever encounter. A true AI would be trained on a lot of these sorts of scenarios but would need to be capable of recognizing edge cases it hasn’t encountered before as well. It would then need to react as safely as possible to those edge cases in a manner similar to how a human would.
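
        That “recognize the unfamiliar and fail safe” behaviour is roughly what driving-automation standards call a minimal-risk manoeuvre. Purely as an illustration of the idea (every name below is hypothetical, and this is not how Tesla or any real driving stack works), a minimal Python sketch of the pattern might look like:

        from dataclasses import dataclass
        from enum import Enum, auto

        class Action(Enum):
            PROCEED = auto()            # follow the nominal driving policy
            YIELD_TO_HUMAN = auto()     # alert the driver and hand back control
            MINIMAL_RISK_STOP = auto()  # slow down and pull over safely

        @dataclass
        class SceneEstimate:
            """Hypothetical perception output for one planning cycle."""
            description: str
            confidence: float  # how closely the scene matches training data, 0..1

        def choose_action(scene: SceneEstimate, floor: float = 0.8) -> Action:
            """Fall back to conservative behaviour when the scene looks unfamiliar."""
            if scene.confidence >= floor:
                return Action.PROCEED
            if scene.confidence >= 0.5:
                # Unusual but not baffling: ask the human to take over.
                return Action.YIELD_TO_HUMAN
            # Truly out-of-distribution (an officer waving traffic through a red,
            # a burning car in the junction): do the safest thing rather than guess.
            return Action.MINIMAL_RISK_STOP

        print(choose_action(SceneEstimate("clear highway, light traffic", 0.95)))
        print(choose_action(SceneEstimate("officer waving cars through a red light", 0.30)))

        The hard part, of course, is the confidence estimate itself, which is exactly the thing that can’t be exhaustively programmed for every situation.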

        Edit: Downvotes must be from Tesla fanbois who can’t face reality. If they had legitimate arguments they would have replied…

          • RushingSquirrel@lemm.ee · 1 year ago

          This is why AI is the solution, rather than coding everything. How does one learn to react in these situations? Either you’ve learned from watching your parents, by taking lessons, by reading the code, or by simply following others. The goal of an AI is to be able to do just that. Coding every single use case is way too complex.

          I know Tesla has worked on improving emergency-vehicle situations, but I don’t know how, or what the current state is.

          Why are you being downvoted?

      • stolid_agnostic@lemmy.ml · 1 year ago

      I think you need protected roadways where no people or non-autonomous vehicles may enter. Short of that, I think you’re right.

      • CmdrShepard@lemmy.one · 1 year ago

      Hell yeah, let these drivers behind the wheel plow into more semi trucks. They deserve it after all. /s

      • Death_Equity@lemmy.world · 1 year ago

      The Wright brothers’ first flight was shorter than the wingspan of a Boeing 747, an aircraft with a range of over 8,000 miles. The Internet was once called a fad.

      Autonomous cars will be the future, and people will die before they become the de facto method of personal transport. The unwilling sacrifices of a public alpha test of the technology are the losses we must endure to achieve the unparalleled safety of ubiquitous autonomous vehicles that mitigate traffic congestion, pedestrian deaths, unwieldy public transit, and the shortcomings of urban sprawl.

      The deaths caused by early adoption benefit the greater good, and we should be willing to accept them as a necessary evil.

      Not that I would ever trust a computer to drive my car. I will drive my own car until it kills me, financially or literally, but I can see what good an imperfect system struggling with growing pains will create.