this post was submitted on 02 May 2025
1086 points (98.7% liked)

memes

[–] randoot@lemmy.world 289 points 2 days ago (3 children)

Ha, if only. Autopilot turns off right before a crash so that Tesla can claim it was off and blame it on the driver. Look it up.

[–] DarrinBrunner@lemmy.world 97 points 2 days ago (2 children)

I didn't know this, but I'm not shocked, or even a little bit surprised.

[–] Sonicdemon86@lemmy.world 47 points 2 days ago (1 children)

Mark Rober had a video testing the autopilot of several cars, and he used his Tesla. The car turned off the autopilot when he crashed through a styrofoam wall.

[–] randoot@lemmy.world 41 points 2 days ago (1 children)

This is how they claim autopilot is safer than human drivers. In reality, Tesla has one of the highest fatality rates, but magically all of those deaths happen when autopilot was "off".

[–] HonoraryMancunian@lemmy.world 27 points 2 days ago* (last edited 2 days ago) (1 children)

Holy shit I did indeed look it up, and it's true. Dunno if it'll hold up but it's still shady as shit

[–] JcbAzPx@lemmy.world 13 points 2 days ago (3 children)

Most states apply liability to whoever is in the driver's seat anyway. If you are operating the vehicle, even if you're not controlling it at that moment, you are expected to maintain safe operation.

That's why the Uber self driving car that killed someone was considered the test driver's fault and left Uber mostly off the hook.

Not sure how it works for the robo taxis, though.

[–] Allonzee@lemmy.world 6 points 2 days ago

Yeah that's gonna be tricky with those. I live in Vegas where they're already operating. No steering wheel at all.

[–] Tja@programming.dev 4 points 1 day ago

The driver is always to blame, even if it was on. They turn it off for marketing claims.

PS: fuck elon

[–] SabinStargem 10 points 1 day ago

The obvious answer is to autopilot into ICE and Trump Regime officials. Elon pays the fine, the world is ridden of MAGATs, and one less Tesla on the road. D, D, D.

/s.

[–] ZILtoid1991@lemmy.world 56 points 2 days ago (1 children)

Except the autopilot will modify its data that it was turned off right at the moment it hits people...

[–] NikkiDimes@lemmy.world 44 points 2 days ago (2 children)

Nah, it just disengages a fraction of a second before impact so they can claim "it wasn't engaged at the moment of impact, so not our responsibility."

There were rumours about this for ages, but I honestly didn't fully buy it until I saw it in Mark Rober's vision vs lidar video and various other follow-ups to it.

[–] Tja@programming.dev 8 points 2 days ago

It's not about responsibility, it's about marketing. At no point do they assume responsibility, like any level 2 system. It would look bad if it was engaged, but you are 100% legally liable for what the car does when on autopilot (or the so-called "full self driving"). It's just a lane keeping assistant.

If you trust your life (or the lives of others) to a lane keeping assistant, you deserve to go to jail, be it Tesla, VW, or BYD.

[–] xeekei@lemm.ee 8 points 2 days ago (2 children)
[–] NotMyOldRedditName@lemmy.world 12 points 2 days ago* (last edited 2 days ago) (4 children)

It turns off, but it's likely so the AEB system can kick in.

AP (Autopilot) and AEB (automatic emergency braking) are separate things.

Also, all L2 crashes that involve an airbag deployment or a fatality get reported if the system was on within something like 30 seconds beforehand, assuming the OEM has the data to report, which Tesla does.

The rules are changing to lessen when it needs to be reported, so things like fender benders aren't necessarily going to be reported for L2 systems in the near future, but something like this would still be reported, and always has been.

[–] sexy_peach@feddit.org 134 points 2 days ago

Autopilot will turn off a few milliseconds before impact either way

[–] guywithoutaname@lemm.ee 40 points 2 days ago (2 children)

I'd imagine you are always responsible for what you do when you're driving, even if a system like autopilot is helping you drive.

[–] LemmyFeed@lemmy.dbzer0.com 35 points 2 days ago (1 children)

Especially because autopilot disengages right before the accident, so it's technically always your fault.

[–] arin@lemmy.world 6 points 2 days ago

Yup gotta read the fine print

[–] opus86 5 points 1 day ago

If you are in the driver's seat, you are responsible for anything the car does unless there was a provable mechanical failure.

[–] supersquirrel@sopuli.xyz 91 points 2 days ago* (last edited 2 days ago) (6 children)

Unironically, this is a perfect example of why AI is being used to choose targets to murder in the Palestinian Genocide, in cases like DOGE attacking the functioning of the U.S. government, in US healthcare companies' denials of claims, and in landlord software colluding to raise rents.

The economic function of AI is to abdicate responsibility for your actions so you can make a bit more money while hurting people, and until the public becomes crystal clear on that, we are in a wild amount of danger.

Just substitute for Elon the vague idea of a company that will become a legal and ethical scapegoat for brutal choices by individual humans.

[–] SkavarSharraddas@gehirneimer.de 30 points 2 days ago (1 children)

Which is why we need laws about human responsibility for decisions made by AI (or software in general).

[–] The_Caretaker@lemm.ee 10 points 2 days ago (1 children)

The economic function of AI is to abdicate responsibility for your actions

Cars already do that without AI. If someone driving a car kills you and they aren't drunk, they probably won't get in any trouble, and the car manufacturers never face any penalty for the 40,000 deaths and 2,000,000 injuries per year they cause in the USA alone.

[–] ZkhqrD5o@lemmy.world 38 points 2 days ago* (last edited 2 days ago) (21 children)

Tldr: Take the train and be safe.

Rant: In the EU, you are 35x more likely to die in a car crash than in a train crash. The union has created the so-called Vision Zero program, which is designed to reach zero driving deaths by some arbitrarily chosen date in the future. And of course it talks about autonomously driving cars. You know, crazy idea, but what if, instead of betting it all on some hypothetical magic Jesus technology that may or may not exist by the arbitrarily chosen date, we focused on the real-world solution that we already have? But then the car industry investors would make less money, so I can answer that myself. :(

Edit: Also, Musk is a Nazi cunt who should die of cancer.

[–] bleistift2@sopuli.xyz 12 points 2 days ago (1 children)

Speaking as a German: There are fewer train-related deaths because the trains don’t drive.

[–] ZkhqrD5o@lemmy.world 6 points 2 days ago

Well, we can thank Mr. Schröder for that. "Der Genosse der Bosse" (the comrade of the bosses).

[–] ArtemisimetrA@lemm.ee 6 points 2 days ago

We have Vision Zero in the US, too. They lowered speed limits in a couple neighborhoods from 25mph to 20, and all the LED road signs show annual aggregated deaths from car crashes until the number is greater than zero, then someone wrings their hands and says "Welp, we did what we could, guess people just like dying" and then goes on vacation. (Source: me, I made up the spokesperson who gets scapegoated, but all the other stuff is observationally evident where I live)

[–] Tja@programming.dev 2 points 1 day ago

Well, there is no train station at my house. Or an Aldi. Or my kids' Kindergarten. And I live in Germany, where public transport is excellent by global standards (memes about Deutsche Bahn aside).

Cars will be necessary for the foreseeable future. Let's make them as safe as possible while investing in public transport, they are not mutually exclusive.

PS: fuck Elon.

[–] Reygle@lemmy.world 39 points 2 days ago (3 children)

Made by someone who's able to THINK like a Tesla owner.

Brake pedal? Unthinkable

[–] Demdaru@lemmy.world 10 points 2 days ago (1 children)
[–] SaharaMaleikuhm@feddit.org 22 points 2 days ago (3 children)

In my country it's always your fault. And I'm very glad.

[–] spankmonkey@lemmy.world 40 points 2 days ago (1 children)

At best Tesla pays a fine, not Elon.

[–] koper@feddit.nl 24 points 2 days ago

Don't worry, DOGE will just fire the investigators before that happens.

[–] PennyRoyal@sh.itjust.works 24 points 2 days ago (1 children)

Wow. That’s a staggeringly apt update

[–] DannyBoy@sh.itjust.works 40 points 2 days ago (2 children)
[–] PennyRoyal@sh.itjust.works 16 points 2 days ago (1 children)
[–] AeonFelis@lemmy.world 8 points 2 days ago (1 children)

Oh. A fine. How will Musk survive that financially?

[–] throwawayacc0430@sh.itjust.works 9 points 2 days ago (1 children)

Jump out of the car.

"I'm not driving, I'm travelling" 🤓

[–] HurlingDurling@lemm.ee 9 points 2 days ago (2 children)
[–] mosiacmango@lemm.ee 8 points 2 days ago (4 children)

Can't Kobayashi Maru the trolley problem. There is no choice but the choices presented.
