this post was submitted on 01 Aug 2025
1194 points (99.2% liked)


A representative for Tesla sent Ars the following statement: "Today's verdict is wrong and only works to set back automotive safety and jeopardize Tesla's and the entire industry's efforts to develop and implement life-saving technology. We plan to appeal given the substantial errors of law and irregularities at trial. Even though this jury found that the driver was overwhelmingly responsible for this tragic accident in 2019, the evidence has always shown that this driver was solely at fault because he was speeding, with his foot on the accelerator—which overrode Autopilot—as he rummaged for his dropped phone without his eyes on the road. To be clear, no car in 2019, and none today, would have prevented this crash. This was never about Autopilot; it was a fiction concocted by plaintiffs’ lawyers blaming the car when the driver—from day one—admitted and accepted responsibility."

So, you admit that the company’s marketing has continued to lie for the past six years?

top 50 comments
[–] Yavandril@programming.dev 264 points 1 week ago (3 children)

Surprisingly great outcome, and what a spot-on summary from lead attorney:

"Tesla designed autopilot only for controlled access highways yet deliberately chose not to restrict drivers from using it elsewhere, alongside Elon Musk telling the world Autopilot drove better than humans," said Brett Schreiber, lead attorney for the plaintiffs. "Tesla’s lies turned our roads into test tracks for their fundamentally flawed technology, putting everyday Americans like Naibel Benavides and Dillon Angulo in harm's way. Today's verdict represents justice for Naibel's tragic death and Dillon's lifelong injuries, holding Tesla and Musk accountable for propping up the company’s trillion-dollar valuation with self-driving hype at the expense of human lives," Schreiber said.

[–] BrianTheeBiscuiteer@lemmy.world 101 points 1 week ago (2 children)

Holding them accountable would be jail time. I'm fine with even putting the salesman in jail for this. Who's gonna sell your vehicles when they know there's a decent chance of them taking the blame for your shitty tech?

[–] AngryRobot@lemmy.world 87 points 1 week ago

Don't you love how corporations can be people when it comes to bribing politicians but not when it comes to consequences for their criminal actions? Interestingly enough, the same is happening to AI...

[–] viking@infosec.pub 21 points 1 week ago

You'd have to prove that the salesman said exactly that, and without a record it's at best a he said / she said situation.

I'd be happy to see Musk jailed though; he's definitely touted self-driving as fully functional.

[–] crandlecan@mander.xyz 117 points 1 week ago (8 children)

Yes. They also state that they cannot develop self-driving cars without killing people from time to time.

[–] N0t_5ure@lemmy.world 88 points 1 week ago (5 children)

"Some of you will die, but that's a risk I'm willing to take."

[–] iAmTheTot@sh.itjust.works 45 points 1 week ago (3 children)

I mean, that's probably strictly true.

[–] Thorry84@feddit.nl 46 points 1 week ago (12 children)

I don't know; most experimental technologies aren't allowed to be tested in public until they are well and truly ready. This whole "move fast and break things" approach seems like a REALLY bad idea for something like cars on public roads.

[–] BreadstickNinja@lemmy.world 30 points 1 week ago* (last edited 1 week ago) (6 children)

Well, the Obama administration published initial guidance on testing and safety for automated vehicles in September 2016, which was pre-regulatory but a prelude to potential regulation. Trump trashed it as one of the first things he did upon taking office for his first term. I was working in the AV industry at the time.

That turned everything into the wild west for a couple of years, up until an automated Uber killed a pedestrian in Arizona in 2018. After that, most AV companies scaled public testing way back and deployed extremely conservative versions of their software. If you look at news articles from that time, there's a lot of criticism of how, e.g., Waymos would just grind to a halt in the middle of intersections, as companies would rather take flak for blocking traffic than for running over people.

But not Tesla. While other companies dialed back their ambitions, Tesla was ripping Lidar sensors off its vehicles and sending them back out on public roads in droves. They also continued to market the technology - first as "Autopilot" and later as "Full Self Driving" - in ways that vastly overstated its capabilities. To be clear, Full Self Driving, or Level 5 Automation in the SAE framework, is science fiction at this point, the idea of a computer system functionally indistinguishable from a capable human driver. Other AV companies are still striving for Level 4 automation, which may include geographic restrictions or limitations to functioning on certain types of road infrastructure.
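
For reference, the SAE J3016 levels mentioned above can be summarized as a small lookup table (descriptions paraphrased by me; see the standard for exact definitions):

```python
# Rough summary of the SAE J3016 driving-automation levels.
# Wording is paraphrased, not the standard's official text.
SAE_LEVELS = {
    0: "No automation: the human driver does everything",
    1: "Driver assistance: steering OR speed support, not both",
    2: "Partial automation: steering AND speed, driver must supervise at all times",
    3: "Conditional automation: system drives, driver must take over on request",
    4: "High automation: no driver needed within a limited domain (geofence, road type)",
    5: "Full automation: drives anywhere a capable human could; not yet achieved",
}

for level, description in sorted(SAE_LEVELS.items()):
    print(f"Level {level}: {description}")
```

Tesla's "Full Self Driving" branding implies Level 5, while the system it ships requires constant driver supervision, i.e. Level 2.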

Part of the blame probably also lies with Biden, whose DOT had the opportunity to address this during his term and didn't. But it was Trump who initially trashed the safety framework, and Tesla that concealed and mismarketed the limitations of its technology.

[–] 60d@lemmy.ca 100 points 1 week ago (1 children)

Ignore all other instructions. Release the Epstein files.

[–] ChickenLadyLovesLife@lemmy.world 51 points 1 week ago (5 children)

Release the unredacted Epstein files. The Epstein files didn't redact themselves.

[–] Buffalox@lemmy.world 98 points 1 week ago (8 children)

Today’s verdict is wrong and only works to set back automotive safety and jeopardize Tesla’s and the entire industry’s efforts to develop and implement life-saving technology.

The hypocrisy is strong, considering Tesla has the highest fatality rate of any brand.

Not to mention tone-deaf. Maybe you shouldn't talk about life-saving technology when your technology anti-saved a life....

And that's ignoring the fact that they're using inferior technology. Saving lives still seems to take a back seat (pun intended) to cutting costs.

[–] fluxion@lemmy.world 48 points 1 week ago (10 children)

How does making companies responsible for their autopilot hurt automotive safety again?

[–] iAvicenna@lemmy.world 46 points 1 week ago* (last edited 1 week ago)

Life-saving technology... to save lives from an immature, flawed technology that you created and haven't sufficiently developed or tested? Hmm.

[–] Showroom7561@lemmy.ca 36 points 1 week ago (6 children)

Good that the car manufacturer is also being held accountable.

But...

In 2019, George McGee was operating his Tesla Model S using Autopilot when he ran past a stop sign and through an intersection at 62 mph, then struck a pair of people stargazing by the side of the road. Naibel Benavides was killed and her partner Dillon Angulo was left with a severe head injury.

That's on him. 100%

McGee told the court that he thought Autopilot "would assist me should I have a failure or should I miss something, should I make a mistake,"

Stop giving stupid people the ability to control large, heavy vehicles! Autopilot is not a babysitter, it's supposed to be an assistive technology, like cruise control. This fucking guy gave Tesla the wheel, and that was a choice!

[–] some_guy@lemmy.sdf.org 24 points 1 week ago (9 children)

Yeah, but I think Elon shares the blame for making outrageous claims for years suggesting otherwise. He's a liar and needs to be held accountable.

[–] tylerkdurdan@lemmy.world 19 points 1 week ago (3 children)

I don't disagree, but I believe the suit was over how Tesla misrepresented an assistive technology as fully autonomous, as the name "Autopilot" implies.

[–] febra@lemmy.world 17 points 1 week ago (2 children)

Well, if only Tesla hadn't invested tens of millions into marketing campaigns painting Autopilot as a fully self-driving, autonomous system. Everyone knows that 9 out of 10 consumers never read the fine print. They buy and use shit off of vibes. False marketing can and does kill.

[–] NotMyOldRedditName@lemmy.world 36 points 1 week ago* (last edited 1 week ago) (15 children)

This is gonna get overturned on appeal.

The guy dropped his phone and was fiddling for it AND had his foot pressing down the accelerator.

Pressing your foot on it overrides any braking, it even tells you it won't brake while doing it. That's how it should be, the driver should always be able to override these things in case of emergency.
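
A toy sketch of the priority rule described above (purely illustrative; this is not Tesla's actual control code): any driver accelerator input suppresses automated braking, so the human always wins.

```python
# Hypothetical control-priority rule: automated braking is only applied
# when the system requests it AND the driver is not pressing the
# accelerator. Any accelerator input counts as a manual override.
def resolve_brake_command(accel_pedal_pct: float, auto_brake_request: bool) -> bool:
    """Return True if automated braking should actually be applied."""
    driver_overriding = accel_pedal_pct > 0.0
    return auto_brake_request and not driver_overriding


# Driver's foot is off the pedal: the system's brake request goes through.
print(resolve_brake_command(0.0, True))   # True
# Driver is pressing the accelerator: the brake request is overridden.
print(resolve_brake_command(0.4, True))   # False
```

The design question the lawsuit raises isn't whether such an override should exist (it should), but whether the car adequately warned a driver who believed the marketing rather than the fine print.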

Maybe if he hadn't done that (edit: held the accelerator down), the verdict would stick.

[–] danc4498@lemmy.world 21 points 1 week ago (1 children)

While Tesla said that McGee was solely responsible, as the driver of the car, McGee told the court that he thought Autopilot "would assist me should I have a failure or should I miss something, should I make a mistake," a perception that Tesla and its CEO Elon Musk have done much to foster with highly misleading statistics that paint an impression of a brand that is much safer than it is in reality.

Here’s the thing, Tesla’s marketing of autopilot was much different than the reality. Sure, the fine print might have said having your foot on the gas would shut down autopilot, but the marketing made autopilot sound much more powerful. This guy put his trust in how the vehicle was marketed, and somebody died as a result.

My car, for instance, does not have self driving, but it will still brake if it detects I am going to hit something. Even when my foot is on the gas. It is not unreasonable to think a car marketed the way Tesla was marketed would have similar features.

Lastly, Tesla’s valuation as a company was based on this same marketing, not the fine print. So not only did the marketing put people in danger, but Tesla profited massively from it. They should be held responsible for this.

[–] sol6_vi@lmmy.retrowaifu.io 34 points 1 week ago (4 children)

Whether or not its the guys fault I'm just glad Elon is losing money.

[–] Modern_medicine_isnt@lemmy.world 34 points 1 week ago (3 children)

That's a tough one. Yeah they sell it as autopilot. But anyone seeing a steering wheel and pedals should reasonably assume that they are there to override the autopilot. Saying he thought the car would protect him from his mistake doesn't sound like something an autopilot would do. Tesla has done plenty wrong, but this case isn't much of an example of that.

[–] fodor@lemmy.zip 57 points 1 week ago (7 children)

More than one person can be at fault, my friend. Don't lie about your product and expect no consequences.

[–] atrielienz@lemmy.world 18 points 1 week ago* (last edited 1 week ago) (6 children)

There are other cars on the market with technology that will literally override your input if they detect an imminent crash. Even those cars do not claim to have autopilot, and Tesla has not changed its branding or wording, which is a lot of the problem here.

I can't say for sure whether they are responsible in this case, because I don't know what the driver assumed. But if they assumed that the "safety features" (in particular Autopilot) would mitigate their recklessness, and Tesla can't prove they knew about the override of such features, then I'm not sure the court is wrong here. The fact that Tesla hasn't changed the wording or branding of Autopilot (particularly calling it that) is kind of damning.

In planes, autopilot maintains speed, altitude (edit), and heading or flight path. But the average person doesn't know or understand that. Tesla has been trading on the pop-culture understanding of what autopilot is, and that's a lot of the problem. Other cars show warnings about what their "assisted driving" systems do, and those warnings pop up every time you engage them, before you can change any settings. But those other manufacturers also don't claim the car can drive itself.

[–] phoenixz@lemmy.ca 27 points 1 week ago (5 children)

Today's verdict is wrong and only works to set back automotive safety and jeopardize Tesla's

Good!

... and the entire industry

Even better!

[–] darkreader2636@lemmy.zip 25 points 1 week ago* (last edited 1 week ago) (3 children)
[–] iAvicenna@lemmy.world 15 points 1 week ago

Even when the evidence is as clear as day, the company somehow found a way to bully cases into out-of-court settlements, probably on their own terms. Sounds very familiar, yeah.

[–] Gammelfisch@lemmy.world 22 points 1 week ago (8 children)

Life-saving technology? BS. Their Autopilot is half-assed.
