- cross-posted to:
- [email protected]
A judge has found “reasonable evidence” that Elon Musk and other executives at Tesla knew that the company’s self-driving technology was defective but still allowed the cars to be driven in an unsafe manner anyway, according to a recent ruling issued in Florida.
Palm Beach county circuit court judge Reid Scott said he had found evidence that Tesla “engaged in a marketing strategy that painted the products as autonomous” and that Musk’s public statements about the technology “had a significant effect on the belief about the capabilities of the products”.
The ruling, reported by Reuters on Wednesday, clears the way for a lawsuit over a fatal crash in 2019 north of Miami involving a Tesla Model 3. The vehicle crashed into an 18-wheeler truck that had turned on to the road into the path of driver Stephen Banner, shearing off the Tesla’s roof and killing Banner.
The concept of autonomous cars might be game over.
As always, advocates forgot about corporate greed. Do you trust your manufacturer not to lie to you? So much that you'd risk killing yourself, your family, and other people on the road?
Yeah, the scary part of this is that as much as I absolutely would never go near this shit with a ten-foot pole when it's clearly still woefully inadequate and overhyped… they very frequently drive within ten feet of me, because for some reason it's legal to put this shit on roads with unwilling participants.
I get it, there's an inevitable conflict of interest here, but we can't really tell other people not to do things we don't like in a free country.
Edit: this is clearly being misinterpreted. I am NOT talking about the Tesla. I’m saying a hypothetical, well-regulated self-driving car can be fielded without the permission of every other motorist that thinks they’re icky.
Yes, we can tell people they can't do things. Welcome to society: we've all been talking and decided on a bunch of things people can't do in a free country. These are public roads; it's entirely reasonable to have restrictions on self-driving cars, just like you can't ride a tandem bicycle in the HOV lane.
People get fined for having unsafe vehicles on public roads all the time. All that’s needed here is a regulatory body to decide self-driving cars are unsafe enough to revoke approval.
Oh hell yeah, if it's unsafe. I'm making the finer point that saying "you don't have the right to drive that car next to me cuz it makes me feel weird" is overstepping.
I'm pretty sure the actual concern has less to do with "feeling weird" and more with "because it and/or its inattentive driver may suddenly kill me" — a dysfunctional self-driving system whose capabilities have been fraudulently marketed and which has, in reality, repeatedly killed people.
They said "for some reason it's legal to push [self-driving cars in general] on unwilling participants". That's what I'm addressing.
Yes, clearly you have missed something.
That’s why I go 100km/h in school zones. Free country!
Yeah but that is beyond what anyone would consider reasonable
Are you telling me not to do it?
I'm saying that you don't need everyone else's permission to drive a safety-regulated self-driving car. That's it. I'm not talking about the Tesla.
deleted by creator
The irony is most likely escaping you…
Dude. You're clearly not understanding the nuance of my point.
lol the no step on snek guy is complaining about nuance while misquoting everyone.
Ok but I’m a democratic socialist… should be a red flag about the assumptions you’re making
truth in advertising laws exist for a reason
Also, the people who frequently talk about a "free country" are often the same ones who want more police so they can do Taliban-style gender policing, so the expression seems deeply inauthentic at this point.
Now if only they were enforced.
True about "free country" being used to justify a society controlled by extreme wealth. And I'm talking about another person's right to "drive" a self-driving car next to me, not about these guys objectively being criminally asshole-ish.
What do you mean when you say this?
I am referring to America, which prides itself on freedom (and not enough on equality and collectivism). I'm just saying it makes legal sense that you don't need the consent of every other motorist to operate a self-driving car (if it passes safety regulations and assuming no regulatory capture). Neither of those assumptions applies here, though.
deleted by creator
Couldn’t agree more
deleted by creator
Autonomous vehicles ten years ago: Human drivers are slow and prone to lapses in judgement.
Autonomous vehicles today: Elon Musk, a guy who famously destroyed a rare vehicle like a dumbass, will be training the AI that drives you around. It won't know how to respond to an event not encountered in the training data, and it will occasionally run into an ambulance.
And his employees hate him so much I wouldn’t be surprised if there’s a patch released that makes one sustained fart noise when airbags deploy.
Maybe, at least until there's a better comprehensive infrastructure of external sensors on the road, at intersections, etc., to control and limit vehicle movement. But that will probably take a long while, considering normal routine road and bridge maintenance is already far behind.
To me, autonomous vehicles are like AI (it actually is AI in the case of Tesla): the public perception is that it's way better than it really is, because it's really good in 80% of cases. But getting to 90-95% will take many, many years still. That doesn't mean we should stop using them, nor abandon them. To progress, we have to keep using them with caution. Learn the limits and work within them. Don't start firing people to be replaced with AI, because in a few months or years you'll realize that the remaining 20% hurts more than you thought. In the same way, you shouldn't remove drivers just yet.
But it's not true AI. In my decades of experience driving cars I've encountered numerous edge cases that I never explicitly learned about during my driver's ed days. One recent case in point: I pulled up to a red light at a fairly busy intersection and stopped. While the light was still red, a police officer on the corner at a construction site walked out and tried to wave me through the intersection. I was watching the red light, so I didn't even see him until he yelled at me.
How would an autonomous AI car handle that situation if it’s not explicitly trained to recognize it? It would need to recognize the police officer as an authority that legitimately overrides the red light.
At the same intersection a few years earlier, I saw a car engulfed in flames right in the middle of it. I saw and heard the fire trucks rapidly approaching as I got to the intersection. I, and others, realized we needed to get out of the way quickly. Would a Tesla AI (or any other) recognize the car is on fire and safely move away, or would it just recognize the shape of the car and patiently wait for it to move out of the intersection before proceeding?
The point is that it’s virtually impossible to predict for, and program an AI to handle, every single situation it might ever encounter. A true AI would be trained on a lot of these sorts of scenarios but would need to be capable of recognizing edge cases it hasn’t encountered before as well. It would then need to react as safely as possible to those edge cases in a manner similar to how a human would.
Edit: Downvotes must be from Tesla fanbois who can't face reality. If they had legitimate arguments they would have replied…
This is why AI is the solution, rather than hand-coding everything. How does one learn how to react in these situations? Either you've learned from watching your parents, by taking lessons, by reading the traffic code, or by simply following others. The goal of an AI is to be able to do just that. Coding every single use case is way too complex.
I know Tesla has worked on improving handling of emergency-vehicle situations, but I don't know how, or what the current state is.
Why are you being downvoted?
I think you need protected ways where no people or non autonomous vehicles may enter. Shy of that, I think you’re right.
You’ve just described a train line sir, and we need way more of that shit around here
Yes it’s called the entire public road system. Alpha test your product on your own roads!
That’s the point. The road system has cyclists and pedestrians that these cars like to kill.
Yes, but what you said sounded like we need to build protected ways. It should be them building protected ways.
I’m saying that this is stupid technology that will only work if you separate it from the public.
Hell yeah, let these drivers behind the wheel plow into more semi trucks. They deserve it after all. /s
The Wright brothers' first flight was shorter than the wingspan of a Boeing 747, an aircraft with a range of over 8,000 miles. The Internet was once called a fad.
Autonomous cars will be the future, and people will die before they become the de facto method of personal transport. The unwilling sacrifices of a public alpha test of the technology are the losses we must endure to achieve the unparalleled safety of ubiquitous autonomous vehicles that mitigate traffic congestion, pedestrian deaths, unwieldy public transit, and the shortcomings of urban sprawl.
The deaths caused by early adoption benefit the greater good, and we should accept their loss as a necessary evil.
Not that I would ever trust a computer to drive my car. I will drive my own car until it kills me, financially or literally, but I can see what good an imperfect system struggling with growing pains will create.