this post was submitted on 25 Jun 2024
87 points (93.9% liked)

Fuck Cars

[–] MrMakabar@slrpnk.net 42 points 3 months ago (7 children)

So they want self-driving cars that do not brake for pedestrians and cyclists? Do I understand this correctly?

[–] dillekant@slrpnk.net 0 points 3 months ago (4 children)

I think it's worth thinking about this in a technical sense, not just in a political or capitalist sense: yes, car companies want self-driving cars, but self-driving cars are immensely dangerous, and there's no evidence that they will make roads safer. As such, legislation should be pushing very hard to stop self-driving cars.

Also, the same technology used for self-driving is used for AEB (automatic emergency braking). This actually makes self-driving more likely: since the car companies have to pay for all that equipment anyway, they may as well try to shoehorn in self-driving. On top of this, I have no confidence that the odds of an error in the system (e.g. a dirty sensor, or software getting confused) are lower than the odds of the system correctly braking when it needs to.
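To make that worry about error odds concrete, here is a rough back-of-the-envelope sketch in Python. Every rate in it is an invented assumption, not data from any real AEB system; the point is only that if false triggers happen even roughly as often per mile as genuine emergencies, phantom activations can easily outnumber real ones:

```python
# Back-of-the-envelope comparison of genuine vs. spurious AEB activations.
# All rates below are made-up assumptions for illustration, not measured figures.

MILES_PER_YEAR = 12_000                       # assumed annual mileage
GENUINE_EMERGENCIES_PER_MILE = 1 / 200_000    # assumed rate of real "must brake now" events
FALSE_TRIGGERS_PER_MILE = 1 / 100_000         # assumed rate of phantom braking (dirty sensor, confused software)

genuine = MILES_PER_YEAR * GENUINE_EMERGENCIES_PER_MILE
spurious = MILES_PER_YEAR * FALSE_TRIGGERS_PER_MILE

print(f"expected genuine activations per year:  {genuine:.2f}")   # 0.06
print(f"expected spurious activations per year: {spurious:.2f}")  # 0.12
# With these invented numbers the system brakes for nothing twice as often
# as it brakes for a real emergency, which is the point about error odds.
```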

This means someone can get into a situation where:

  • they are in a car, on a road, with nothing of interest in front of them
  • the software falsely determines that there is an imminent crash (see the sketch after this list)
  • the car brakes hard (even at 90 mph), perhaps losing traction depending on road conditions
  • they may be hit from behind or may hit an object
  • the driver is liable even though they never actually pressed the brakes.
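Here is a minimal sketch of a naive time-to-collision trigger, assuming a made-up 1.5 second threshold and a single corrupted range reading. No real AEB system is this simple, but it shows how one bad sensor value can translate directly into a hard brake at highway speed:

```python
# Minimal sketch of a naive time-to-collision (TTC) braking trigger.
# The threshold and the glitched reading are invented for illustration;
# real AEB stacks add sensor fusion and plausibility checks on top of this idea.

MPH_TO_MS = 0.44704
TTC_THRESHOLD_S = 1.5  # assumed: brake hard if a collision is predicted within 1.5 s

def should_emergency_brake(range_m: float, closing_speed_ms: float) -> bool:
    """Brake if the detected object would be reached within the TTC threshold."""
    if closing_speed_ms <= 0:
        return False  # object is not getting closer
    return range_m / closing_speed_ms < TTC_THRESHOLD_S

speed = 90 * MPH_TO_MS  # ego vehicle at 90 mph, roughly 40 m/s

# Empty road, correct long-range reading: no braking.
print(should_emergency_brake(range_m=200.0, closing_speed_ms=speed))  # False (TTC ~5 s)

# Same empty road, but one glitched reading (dirt, reflection, software bug)
# reports an object 30 m ahead: full emergency brake at highway speed.
print(should_emergency_brake(range_m=30.0, closing_speed_ms=speed))   # True (TTC ~0.75 s)
```

In that second case the driver never touched anything, which is exactly the liability problem in the last bullet.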

This is unacceptable on its face. Yes, cars are dangerous, and yes, we need to make them safer, but we should do that with better policies like lower speeds, safer road design, and a transition to smaller, lighter cars, not this AI automation bullshit.

[–] Clent@lemmy.world 5 points 3 months ago (1 children)

Under what circumstances does being hit from behind result in liability for the lead vehicle? It's the responsibility of the vehicle behind you to keep an appropriate distance. This sounds like you're regurgitating their talking points like a bot.

[–] dillekant@slrpnk.net 2 points 3 months ago

I conflated two points. Driver hits something due to sudden braking = they are liable.

Driver hit from behind at high speed = dangerous for the occupants. Either way, no one asked the driver.
