This post was submitted on 14 Aug 2023
504 points (96.7% liked)

Technology

New Footage Shows Tesla On Autopilot Crashing Into Police Car After Alerting Driver 150 Times
Six officers who were injured in the crash are suing Tesla despite the fact that the driver was allegedly impaired

[–] CaptainProton@lemmy.world 74 points 1 year ago* (last edited 1 year ago) (5 children)

This is stupid. Teslas can park themselves, they're not just on rails. It should be pulling over and putting the flashers on if a driver is unresponsive.

That being said, the driver knew how the system behaved, acted with wanton disregard for safe driving practices, and so the incident is the driver's fault and they should be held responsible for their actions. It's not the court's job to legislate.

It's actually NHTSA's job to regulate car safety, so if it doesn't already have the authority, Congress needs to grant it the power to regulate what AI driving behavior is acceptable and to define safeguards against misbehaving AI.

[–] socsa@lemmy.ml 10 points 1 year ago (2 children)

There's no way the headline is true. Zero percent. The car will literally do exactly what you stated if it goes too long without driver engagement, and I've experienced it firsthand.

[–] doggle@lemmy.world 4 points 1 year ago

The headline doesn't state that the warnings were consecutive.

Perhaps the driver was just aware enough to keep squelching warnings and prevent the car from stopping altogether?

I'll grant you, though, 150 warnings is still a little tough to believe...
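The "kept squelching warnings" theory above can be sketched as a strike-based engagement monitor. This is purely a hypothetical illustration (class name, threshold, and policy are my own inventions, not Tesla's actual logic), but it shows how a driver who acknowledges every alert could rack up 150 warnings without ever triggering a forced stop:

```python
# Hypothetical sketch, NOT Tesla's implementation: an escalating
# driver-engagement monitor. An unanswered alert adds a "strike";
# enough strikes force a safe stop. An acknowledged alert simply
# clears, so a semi-alert driver can keep the count at zero forever.

class EngagementMonitor:
    def __init__(self, max_strikes: int = 3):
        self.max_strikes = max_strikes  # assumed threshold
        self.strikes = 0
        self.alerts_issued = 0

    def alert(self, driver_responded: bool) -> str:
        """Issue one engagement alert and return the resulting action."""
        self.alerts_issued += 1
        if not driver_responded:
            self.strikes += 1
        if self.strikes >= self.max_strikes:
            return "pull_over"   # disengage and stop safely
        return "continue"        # alert cleared, keep driving

monitor = EngagementMonitor(max_strikes=3)
# A driver who acknowledges every single alert never strikes out,
# no matter how many alerts are issued.
for _ in range(150):
    action = monitor.alert(driver_responded=True)
```

Under a policy like this, "150 alerts" and "never forced to stop" aren't contradictory; a stricter design might also count acknowledged alerts within a time window toward a lockout.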

[–] lapommedeterre@lemmy.world 4 points 1 year ago

Evidently, he was aware enough to respond to the alerts, per the logs (as stated in the WSJ video that's in the article). It shows a good bit of the footage, too.

Seems like they need something better for awareness checking than just gripping the wheel and checking where your eyes are pointed. And obviously better sensors for object recognition.

[–] chris2112@lemmy.world 6 points 1 year ago

The driver is responsible for this accident, but Tesla should still be liable, IMO, for all the shady and outright misleading advertising around their so-called "self driving". Compare Tesla's marketing to GM's or Hyundai's, both of which essentially have feature parity with Tesla's system, and you'll see a big difference.

[–] doggle@lemmy.world 3 points 1 year ago

Sounds like the injured officers are suing. It's a civil case, not a criminal one, so I'm not sure how much the court would actually be asked to legislate. I'd be interested to hear their arguments, though I'm sure part of their reasoning for suing Tesla instead of the driver is that Tesla has more money.

[–] dzire187@feddit.de 3 points 1 year ago (1 children)

It should be pulling over and putting the flashers on if a driver is unresponsive.

Yes. Actually, just stopping in the middle of the road with hazard lights would be sufficient.

[–] CmdrShepard@lemmy.one 5 points 1 year ago* (last edited 1 year ago)

You say that, yet a Tesla did exactly that, which caused some tailgaters to crash into the back of it, and everyone blamed the Tesla for causing an accident.

https://theintercept.com/2023/01/10/tesla-crash-footage-autopilot/