this post was submitted on 03 Sep 2023
Technology
[–] TheHalc@sopuli.xyz 2 points 1 year ago

> take responsibility [... like] human drivers do.

But do they really? If so, why is there the saying "if you want to murder someone, do it in a car"?

I do think self-driving cars should be held to a higher standard than humans, but I believe the fundamental disagreement is in precisely how much higher.

While zero incidents is naturally what they should be aiming for, it's more of a goal for continuous improvement, like it is for air travel.

What liability can/should we place on companies that provide autonomous drivers that will ultimately lead to safer travel for everyone?

[–] rikudou@lemmings.world 3 points 1 year ago

Well, the laws for sure aren't perfect, but people are responsible for the accidents they cause. Obviously there are plenty of exceptions, like rich people, but if we're talking about the ideal real-life scenario, there are consequences for causing an accident. Whether those consequences are appropriate or not is for another discussion.

[–] abhibeckert@beehaw.org 2 points 1 year ago

> While zero incidents is naturally what they should be aiming for, it's more of a goal for continuous improvement, like it is for air travel.

As far as I know, proper self-driving (not "autopilot") AVs are pretty close to zero incidents if you only count crashes where they're at fault.

When another car runs a red light and smashes into the side of an autonomous vehicle at 40 mph, it wasn't the AV's fault. Those crashes shouldn't be counted, but as far as I know they currently are in most stats.

> What liability can/should we place on companies that provide autonomous drivers that will ultimately lead to safer travel for everyone?

I'm fine with exactly the same liability human drivers have. Unlike humans, who may drive dangerously for fun, drive home while high on drugs, or push through the night without sleep to avoid paying for a hotel, autonomous vehicles have zero motivation to take risks.

In the absence of that motivation, the simple fact that insurance against accidents is expensive is more than enough to encourage these companies to continue to invest in making their cars safer. Because the safer the cars, the lower their insurance premiums will be.

Globally, insurance against car accidents is approaching half a trillion dollars per year, and it's increasing over time. With money like that on the line, why not spend a lazy hundred billion dollars or so on better safety? It won't actually cost anything; it will save money.
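The back-of-envelope arithmetic here can be made explicit. A minimal sketch, using the rough figures from this comment (illustrative round numbers, not real actuarial data):

```python
# Break-even sketch: how fast a safety investment pays for itself
# via lower insurance premiums. Figures are the rough ones above.
annual_premiums = 500e9    # ~half a trillion USD/year in global car-accident insurance
safety_investment = 100e9  # hypothetical one-off "lazy hundred billion" on safety

# If safer cars cut claims (and hence premiums) by some fraction per year,
# the payback period is simply investment / annual savings.
for cut in (0.05, 0.10, 0.20):
    annual_savings = annual_premiums * cut
    payback_years = safety_investment / annual_savings
    print(f"{cut:.0%} premium reduction -> payback in {payback_years:.1f} years")
```

Even a modest 5% reduction in premiums repays the investment in a few years; at 20% it pays for itself almost immediately, which is the "it will save money" point in a nutshell.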

[–] jarfil@beehaw.org 1 points 1 year ago

> the safer the cars, the lower their insurance premiums will be.

> Globally insurance against car accidents is approaching half a trillion dollars per year

That... almost makes it sound like the main opposition to autonomous cars would be insurance companies: they can't earn more by raising premiums if there are no accidents and a competing insurer can offer much cheaper coverage.