We already have that https://www.sae.org/blog/sae-j3016-update
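For reference, J3016 boils down to six levels. A quick sketch in Python (level names paraphrased from the standard, not an official mapping; the L3 cutoff for "autonomous" matches how the thread below describes the regulatory line):

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """SAE J3016 driving-automation levels (names paraphrased)."""
    NO_AUTOMATION = 0           # human does all the driving
    DRIVER_ASSISTANCE = 1       # steering OR speed support (e.g. adaptive cruise)
    PARTIAL_AUTOMATION = 2      # steering AND speed support, driver must supervise
    CONDITIONAL_AUTOMATION = 3  # car drives itself in limited conditions, driver takes over on request
    HIGH_AUTOMATION = 4         # no takeover needed inside its operational domain
    FULL_AUTOMATION = 5         # drives anywhere a human could

def is_autonomous(level: SAELevel) -> bool:
    # The line regulators care about: "autonomous" starts at level 3
    return level >= SAELevel.CONDITIONAL_AUTOMATION

print(is_autonomous(SAELevel.PARTIAL_AUTOMATION))      # False: Tesla Autopilot/FSD territory
print(is_autonomous(SAELevel.CONDITIONAL_AUTOMATION))  # True: Mercedes Drive Pilot territory
```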
Yes, we have the definitions, but I haven't read about whether they're effectively required. Is there a test, a certification authority, rules for liability or revocation? Have we established a way to actually require it?
I hope we wouldn't let manufacturers self-certify, although historical data is important evidence. I hope we don't aid manufacturers' profitability by either limiting their liability or creating a path to justice that is doomed to fail.
This stuff is highly regulated https://en.m.wikipedia.org/wiki/Regulation_of_self-driving_cars
Mercedes has the first autonomous car (L3) you can buy, which you can only activate at low speeds on certain roads in Germany. It's only possible because of lidar sensors, and when it's activated you are legally allowed to look at your phone, as long as you can take over within 10 or so seconds.
You aren't allowed to do this in a Tesla, since the government doesn't categorize Teslas as autonomous vehicles, a category that requires L3 or above.
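That handover rule is basically a timer. A toy sketch of what the loop might look like (hypothetical names and logic, not Mercedes' actual implementation; the 10-second window is the one mentioned above):

```python
import time

TAKEOVER_GRACE_S = 10.0  # window described above; the certified value is system-specific

class Level3Handover:
    """Toy sketch of an L3 takeover-request loop; hypothetical, not a real system."""

    def __init__(self):
        self.takeover_requested_at = None  # None = system is responsible

    def request_takeover(self):
        # Fires when the car is about to leave its operational design domain,
        # e.g. the traffic jam clears and speed will exceed the certified limit.
        self.takeover_requested_at = time.monotonic()
        print("Chime: please take over")

    def driver_took_over(self):
        self.takeover_requested_at = None

    def tick(self):
        if self.takeover_requested_at is None:
            return  # system is driving; the driver may legally look away
        if time.monotonic() - self.takeover_requested_at > TAKEOVER_GRACE_S:
            # Driver never responded: fall back to a minimal-risk maneuver
            print("No response: slowing to a controlled stop")
```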
No car manufacturer can sell truly autonomous vehicles without government approval. Tesla FSD is just marketing BS. Others are far ahead in terms of autonomous driving tech.
The thing is, humans are horrible drivers, taking a huge toll in lives and property every year.
We may already be at the point where we need to weigh the ethics of immature self-driving causing too many accidents against humans causing even more. We can clearly see the shortcomings of all self-driving technology so far, but is it ethical to block immature technology if it saves lives overall?
Maybe it's the trolley problem. Should we take the branch that leads to deaths, or the branch that leads to more deaths?
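You can put rough numbers on that branch comparison. The human baseline below is approximately the published US figure (~1.3 deaths per 100 million vehicle-miles); the self-driving rate is a pure placeholder, because that unknown number is exactly what the argument is about:

```python
# Back-of-envelope version of the trolley framing.
HUMAN_FATALITIES_PER_100M_MILES = 1.3  # approximate US NHTSA figure
AV_FATALITIES_PER_100M_MILES = 1.0     # hypothetical placeholder, NOT a measured rate

miles_driven = 3_000_000_000_000  # ~3 trillion US vehicle-miles per year, approximate

human_deaths = miles_driven / 100_000_000 * HUMAN_FATALITIES_PER_100M_MILES
av_deaths = miles_driven / 100_000_000 * AV_FATALITIES_PER_100M_MILES

print(f"Humans: ~{human_deaths:,.0f} deaths/year")
print(f"AVs:    ~{av_deaths:,.0f} deaths/year")
print(f"Blocking the tech 'costs' ~{human_deaths - av_deaths:,.0f} lives/year, "
      f"IF the AV rate really is lower")
```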
True
Are you talking about Waymo vs human drivers? It isn't currently economical (and may never be) to roll that out globally. That would cost trillions and probably wouldn't even be feasible everywhere.
Teslas aren't autonomous, just driver assistants, so you can't compare them. Otherwise you'd also have to include Mercedes (which, by the way, has the first commercial Level 3 car), BMW, BYD, ...
It would be very unethical to allow companies to profit from dangerous and unsafe technology that kills people.
No manufacturer does good self-driving yet.
Several manufacturers, including Tesla, make driver assistants that are more reliable than humans in at least some cases, possibly most of the time.
It's easy to say you don't want to allow companies to profit from unsafe technology that kills people, but what is the other choice? If you send the trolley down the other track, you're choosing different deaths at the hands of unsafe humans. We will soon be at the point, or already are, where your choice kills more people. Is that really such an easy choice?
Where do you get that? From Elon?
Yes, safety features and driving assistants make driving safer. Letting the car drive by itself does not (especially with Teslas).