this post was submitted on 14 Aug 2023
Not a fan of Tesla or Musk, but I think it always bears repeating in these conversations that AI driving will be much safer than human driving if it isn't already.
Unfortunately, accidents will happen, but when an accident happens with an AI, ALL the other AIs get to learn from that failure going forward.
I'm very happy that in my old age, I'll have some future version of this driving me around... or more likely, taking the wheel from me if I do something stupid.
AI driving is only as good as its sensors.
While most other companies use LIDAR, Musk switched to video cameras because it's cheaper.
Which is why Tesla "FSD" is worse than competitors.
This isn't necessarily accurate. More sensors mean more raw data that has to be parsed and computed, and you can run into issues where the two systems disagree and the computer won't know which one to trust. Additionally, things like rain and snow can confuse LIDAR.
It may very well be that LIDAR is a required component for autonomous driving, but no companies have a fully functional system yet, so none of us can do more than speculate on what sensors are necessary.
Computers don't require the two systems to agree. They just need good algorithms to analyse the data from both sensors.
The human body has sight and hearing sensors. Sometimes our sensors disagree (we see distant lightning before we hear the thunder), but we have the algorithm to analyse the input and come to the correct conclusion.
With good enough algorithms, you don't even need two systems. Humans can drive perfectly fine off vision alone.
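The disagreement-resolution idea in the comments above can be sketched as inverse-variance weighting of two noisy sensor readings — a minimal illustration with made-up numbers, not any vendor's actual fusion stack:

```python
# Minimal sensor-fusion sketch: combine two noisy distance estimates
# (say, camera and LIDAR) by weighting each with the inverse of its
# variance, so the more precise sensor dominates. All numbers are
# illustrative assumptions, not real sensor specifications.

def fuse(est_a: float, var_a: float, est_b: float, var_b: float) -> float:
    """Return the inverse-variance-weighted average of two estimates."""
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    return (w_a * est_a + w_b * est_b) / (w_a + w_b)

# Camera says the obstacle is 21.0 m away but is noisy (variance 4.0);
# LIDAR says 19.5 m and is more precise (variance 0.25).
fused = fuse(21.0, 4.0, 19.5, 0.25)
print(round(fused, 2))  # fused estimate lands close to the LIDAR reading
```

The point is that "the sensors disagree" isn't a dead end: a principled combination rule resolves the conflict instead of forcing the computer to pick one sensor outright.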
Autopilot is terrible and the fact that they advertise it as a reputable system is abhorrent. And yes, I own a tesla.
I'm pretty happy with autopilot in our cars, especially on road trips. It really helps with driving fatigue.
A straight, flat interstate with well-painted lines in clear conditions is the only time I trust it anymore. Way too many close calls in every other situation.
The FSD stack understands the road MUCH better than any other car I've used out there. But its decision-making can still be dumb when deciding which lane to be in.
"This thing that does not exist and nobody has any idea how to make it" will totally be safer than human driving.
You know what is safer than human driving and we know how to make? Trains.
I'm still waiting for my train from LA to SF. It's been in the works since I was in college. I've already graduated, had multiple jobs, early retired, and there's still no sign of it.
I'm pretty sure people get hit by trains on a daily basis.
I believe that, statistically, FSD is already a better driver than a human. Of course there are situations that confuse the AI and make it commit errors a human wouldn't, but this kind of stuff gets slowly ironed out over time. People also seem to forget that human drivers make pretty fucking stupid mistakes too. Enough so that 40,000 of them die every year in the US alone. 100% safety is probably impossible to achieve, and even 99.99% safety would still mean 33,000 accidents per year (0.01% of the US population of roughly 330 million).
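The 33,000 figure above appears to be 0.01% of the US population; a quick back-of-the-envelope check, with the ~330 million population figure as an assumption:

```python
# Sanity-check the comment's arithmetic (all inputs are rough assumptions).
us_population = 330_000_000   # approximate US population
safety_rate = 0.9999          # hypothetical "99.99% safe" system
affected = us_population * (1 - safety_rate)
print(round(affected))        # 33000
```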
It's easy to pick on Tesla because the CEO is quite unpopular, so every time a Tesla does something it's not supposed to, it gets such wide media attention that it seems way more common than it really is.
Nevertheless, self-driving cars are here to stay, and there will come a time when wanting to drive by yourself will be considered irresponsible and unsafe. And I say this as someone with zero interest in owning such a car.
AI driving will probably be safer one day, but there is no real data today demonstrating that its current state already is. At the same time, we keep seeing examples of it failing at the most basic things.