this post was submitted on 10 Jun 2023
17 points (100.0% liked)

More or less: Tesla's Autopilot is not as safe as Tesla would have you believe.

[–] Wiitigo@lemmy.world 11 points 1 year ago (2 children)

Still almost exactly half the crash rate of human-only drivers. Therefore, we should ban human-only driving.

[–] darkmugglet@lemm.ee 6 points 1 year ago (4 children)

You're missing the point -- with a human driver there is accountability. If I, as a human, cause an accident, I have either criminal or civil liability. With Autopilot, the question of "who is at fault" gets murky. On top of that, Tesla is not obligated to report the crashes, and the failure modes of automated driving are very different from human errors.

I don't think anyone is suggesting that we ban autonomous driving. But it needs better oversight and accountability.

[–] Locrin@lemmy.world 4 points 1 year ago (1 children)

In these cases the human is still accountable. Do you think that if a Tesla plowed into a kindergarten while using Autopilot, the driver would avoid punishment? The driver is using a feature of the car, and it tells you to stay alert and be prepared to take over on short notice. The ones crashing are the idiots who sit in the back seat, go to sleep, or play on their phones while Autopilot is on. The only self-driving right now where I would be in favour of punishing the company if something went wrong is the taxis where you are purely a passenger.

Sit behind the wheel, you are responsible for what happens.

[–] JillyB@beehaw.org 1 points 1 year ago (1 children)

I don't think this is a practical take. If I'm driving a car, I'm in control and know my intentions. If I'm responsible for an accident, it's because I wasn't fully alert or did something stupid.

If Autopilot is driving the car, I don't know the car's intentions. It might cause a dangerous situation before my brain can process that it has bad intentions and take over. If it sees something in the road that isn't there, it might swerve or brake, and I won't recognize it until it's already happened. And that's considering an alert driver with full concentration behind the wheel. The whole point of Autopilot is to reduce the driver's workload, and it does that by requiring less concentration. I think it's inherently dangerous to require human intervention in autopilot systems.

[–] Locrin@lemmy.world 1 points 1 year ago

When using adaptive cruise control you can set the speed to, let's say, 60. If you are driving behind someone and they slow down to 30 to take a steep turn, they might disappear from your car's sensors. In that case the car might see no obstacle and rapidly accelerate to get back to 60. That is scary, because suddenly the car is accelerating toward a sharp turn. This is not theoretical; my friend's Volvo has done this multiple times.
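Sketched as code, the failure mode looks something like this (everything here -- the names, the thresholds, the whole controller -- is a hypothetical toy for illustration, not any real manufacturer's logic):

```python
from typing import Optional

# Toy sketch of naive adaptive-cruise-control target-speed logic.
# Hypothetical and heavily simplified; real ACC stacks are far more involved.

def target_speed(set_speed: float, lead_speed: Optional[float]) -> float:
    """Return the speed the controller accelerates or brakes toward.

    lead_speed is None when the sensors report no lead vehicle -- e.g.
    because the car ahead slowed for a bend and left the sensor's
    field of view.
    """
    if lead_speed is None:
        # No obstacle detected: revert to the driver's set speed.
        # This is the scary case -- hard acceleration toward a turn
        # the car cannot "see" around.
        return set_speed
    # Otherwise match the slower of the two speeds.
    return min(set_speed, lead_speed)

print(target_speed(60, 30))    # 30 -- following the slower car
print(target_speed(60, None))  # 60 -- lead car lost, sudden acceleration
```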

If your argument is safety, it's moot: Autopilot has fewer accidents than humans.

Autopilot is just a more advanced version of this. It is brilliant as long as you know its quirks. For highway driving with few cars around, you can probably relax as much as or more than you would while just cruising. For city driving you should be alert and ready to take over at any time, but you might not have to navigate that complex intersection yourself and can pay more attention to your surroundings.

Unless they get to a point where you can fold in the steering wheel and just be a passenger, the burden falls on the driver.

[–] Fubarberry@aiparadise.moe 1 points 1 year ago

I'm all for more accountability, but it's still better than human driving. Cutting human car deaths in half in exchange for murky accountability is clearly a worthwhile trade.

[–] Wiitigo@lemmy.world 1 points 1 year ago

I was commenting on the original post, which asserted that "Autopilot was not as safe as Tesla would have you believe."

I think you hopped topics altogether. And I actually agree with you.

[–] Faceman2K23@discuss.tchncs.de 1 points 1 year ago

My main issue with Tesla's Autopilot is its branding and the way they advertise it.

Almost every non-tech person I talk to about it thinks it is 100% a hands-off robot driver, and that is a very, VERY dangerous idea.

It's a very good system, and it is improving with every update, but it is far from the idea that many people have in their heads.

The videos you see of people sleeping on Autopilot are worrying. Do Teslas not have driver-alertness monitoring? If I look away from the road for 5 seconds in my Mazda, it lets me know very loudly that it wants me to pay attention, and if I were to fall asleep it would do its best to wake me up. When I use its very simple and limited self-driving function, I can't take my hands off the wheel for more than about 10 seconds before it alerts me.
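The timeout behaviour described here is essentially a watchdog timer. A minimal sketch (the class, thresholds, and method names are all hypothetical, loosely modelled on the Mazda behaviour above):

```python
import time

# Toy driver-monitoring watchdog; not any real vendor's implementation.
GAZE_TIMEOUT_S = 5.0    # eyes off the road before an alert
HANDS_TIMEOUT_S = 10.0  # hands off the wheel while assist is engaged

class AttentionWatchdog:
    def __init__(self) -> None:
        now = time.monotonic()
        self.last_gaze = now   # last time eyes were on the road
        self.last_hands = now  # last time hands were on the wheel

    def update(self, eyes_on_road: bool, hands_on_wheel: bool) -> list:
        """Feed the latest sensor readings; return any alerts to raise."""
        now = time.monotonic()
        if eyes_on_road:
            self.last_gaze = now
        if hands_on_wheel:
            self.last_hands = now

        alerts = []
        if now - self.last_gaze > GAZE_TIMEOUT_S:
            alerts.append("LOOK AT THE ROAD")
        if now - self.last_hands > HANDS_TIMEOUT_S:
            alerts.append("HANDS ON THE WHEEL")
        return alerts
```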

[–] RandomBit@sh.itjust.works 6 points 1 year ago (1 children)

I don’t think this is a fair comparison, since an Autopilot crash is a two-stage failure: the Autopilot and then the driver both failed to avoid the crash. The statistics do not include the incidents where Autopilot would have crashed but the human took control and prevented it. If all instances of human intervention were included, I doubt Autopilot would be ahead.

[–] Kepler@lemmy.world 1 points 1 year ago (1 children)

If all instances of human intervention were included, I doubt Autopilot would be ahead.

Why would you interpret non-crashes due to human intervention as crashes? If you're doing that for autopilot non-crashes you've gotta be consistent and also do that for non-autopilot non-crashes, which is basically...all of them.

[–] RandomBit@sh.itjust.works 3 points 1 year ago

If a human crashes and their action/vehicle is responsible for the crash, the crash should be attributed to the human (excepting mechanical failure, etc). I believe that if an advanced safety system, such as automatic braking, prevents a crash that otherwise would have occurred, the prevented crash should be included in the human tally. Likewise, if Autopilot would have crashed if not for the intervention of the driver, the prevented crash should be attributed to Autopilot.
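A toy calculation shows why this accounting matters (every number below is made up purely to illustrate the arithmetic, not a real crash statistic):

```python
# Hypothetical figures, chosen only to show how attributing driver
# interventions to Autopilot can flip the comparison.
autopilot_crashes = 10     # crashes that occurred with Autopilot engaged
driver_saves = 15          # would-be crashes a human prevented (unreported)
autopilot_miles = 10_000_000

human_crashes = 18
human_miles = 10_000_000

reported_rate = autopilot_crashes / autopilot_miles                    # 1.0 per M miles
adjusted_rate = (autopilot_crashes + driver_saves) / autopilot_miles   # 2.5 per M miles
human_rate = human_crashes / human_miles                               # 1.8 per M miles

print(reported_rate < human_rate)  # True  -- Autopilot looks safer
print(adjusted_rate < human_rate)  # False -- not once saves are counted
```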

As has been often studied, the major problem for autonomous systems is that until they are better than humans WITHOUT human intervention, the result can be worse than both. People are much less likely to pay full attention and have the same reaction times if the autonomous system is in full control the majority of the time.