this post was submitted on 09 May 2025
514 points (97.4% liked)
Technology
I get what you mean, but it's still stuck at Level 2 and it always will be. No matter how good it is, if you take your eyes off the road, it will eventually kill you. Cameras alone are not sufficient for autonomous driving.
I disagree with this assertion, because they're correct that the only beings that can currently drive rely on vision. Vision alone is sufficient for driving.
But autonomous driving really hasn’t succeeded yet. We still have no idea what is required for autonomous driving or whether we can do it at all, regardless of sensors.
So you're implying that we can definitely do autonomous driving but can't do it the way humans do, whereas I say we won't know the requirements until we find an approach that succeeds, and we may never.
Yeah, sure. If you want the same bad results as humans deliver in terms of crash rates, then it's possible. I wouldn't trust it. Also, human vision and processing are completely different from computer vision and processing.
Presumably we have the intelligence to set requirements before something can be called self-driving. That's usually what the fuss is about: whether the marketing is claiming it's something it's not.
If they fail with their approach, I'm fine with that, just like I'm fine if Waymo fails with theirs. If either succeeds, why should I care how? Obviously there's a problem if it runs over some old lady at a stop sign and drags her down the street, but that's clearly a failure for them.
We already have that https://www.sae.org/blog/sae-j3016-update
Yes, we have the definitions, but I haven't read whether they're effectively required. Is there a test, a certification authority, rules for liability or revocation? Have we established a way to actually enforce them?
I hope we wouldn't let manufacturers self-certify, although historical data is important evidence. I also hope we don't prop up manufacturers' profitability by limiting their liability or by creating a path to justice that's doomed to fail.
This stuff is highly regulated https://en.m.wikipedia.org/wiki/Regulation_of_self-driving_cars
Mercedes has the first autonomous car (L3) you can buy, which you can only activate at low speeds on certain roads in Germany. It's only possible because of lidar sensors, and when it's activated you are legally allowed to look at your phone, as long as you can take over within 10 or so seconds.
You aren't allowed to do this in a Tesla, since the government doesn't categorize Teslas as autonomous vehicles, which requires L3+.
No car manufacturer can sell truly autonomous vehicles without government approval. Tesla FSD is just marketing BS. Others are far ahead in autonomous driving tech.
The thing is, humans are horrible drivers, costing a huge toll in lives and property every year.
We may already be at the point where we need to weigh the ethics of inadequate self-driving causing accidents against humans causing more of them. We can clearly see the shortcomings of all self-driving technology so far, but is it ethical to block immature technology if it saves lives overall?
Maybe it's the trolley problem: should we take the branch that leads to deaths, or the branch that leads to more deaths?
True
Are you talking about Waymo vs. human drivers? It's currently not (and maybe never will be) economical to roll that out globally. It would cost trillions and probably wouldn't even be feasible everywhere.
Teslas aren't autonomous, just mere driver assistants, so you can't compare them. Otherwise you'd also have to include the Mercedeses (which, btw, have the first commercial Level 3 car), BMWs, BYDs, ...
It would be very unethical to allow companies to profit from dangerous and unsafe technology that kills people.
No manufacturer does good self-driving yet.
Several manufacturers, including Tesla, make driver assistants that are more reliable than humans in at least some cases, possibly most of the time.
It's easy to say you don't want to allow companies to profit from unsafe technology that kills people, but what is the other choice? If you send the trolley down the other track, you're choosing different deaths at the hands of unsafe humans. We will soon be at the point, or already are, where your choice kills more people. Is that really such an easy choice?
Where do you get that? From Elon?
Yes, safety features and driver assistants make driving safer. Letting the car drive by itself does not (especially with Teslas).