this post was submitted on 27 Jul 2023
462 points (100.0% liked)

Technology


This is the official technology community of Lemmy.ml for all news related to creation and use of technology, and to facilitate civil, meaningful discussion around it.



cross-posted from: https://derp.foo/post/81940

There is a discussion on Hacker News, but feel free to comment here as well.

[–] DauntingFlamingo@lemmy.ml 30 points 1 year ago* (last edited 1 year ago) (5 children)

The most basic driving, like long stretches of highway, shouldn't be banned from using AI/automated driving. Fast-paced inner-city driving should be augmented but not fully automatic. The same goes for driving in inclement weather: augmented, with hard limits on speed and automated braking for anything that could result in a crash.

Edit: I meant this statement as referring to the technology in its current consumer form (what is available to the public right at this moment). I fully expect that as the technology matures, the percentage of incidents will decline. We are likely to attain a largely driverless society one day in my lifetime.
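The "augmented with hard limits" idea above could be sketched roughly like this. Everything here is illustrative: the function name, the 2-second time-to-collision threshold, and the weather-based speed cap are made-up placeholders, not how any real vehicle system works.

```python
# Hypothetical driver-assist tick: cap speed in bad weather and decide
# whether to trigger automated braking based on time-to-collision (TTC).
# All names and thresholds are invented for illustration only.

def assist(speed_kmh: float, weather_cap_kmh: float,
           gap_m: float, closing_speed_ms: float) -> dict:
    """Return the enforced speed cap and a brake decision for one tick."""
    capped = min(speed_kmh, weather_cap_kmh)  # hard speed limit
    # Time-to-collision in seconds; infinite if not closing on anything.
    ttc = gap_m / closing_speed_ms if closing_speed_ms > 0 else float("inf")
    return {"speed_cap": capped, "brake": ttc < 2.0}  # brake under 2 s TTC
```

For example, at 80 km/h with an obstacle 10 m ahead closing at 10 m/s, TTC is 1 s and the sketch would command braking; on an open highway it would only enforce the weather cap.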

[–] snooggums@kbin.social 23 points 1 year ago (2 children)

"Self driving with driver assist" or whatever they call it when it isn't 100% automated is basically super fancy cruise control and should be treated as such. The main problem with the term autopilot is that for airplanes it means 100% control and very misleading when used for fancy cruise control in cars.

I agree that it should be limited to highways and other open roads, just as cruise control should be. People using cruise control in the city without being ready to brake is the same basic issue.

100% full automation, with no expectation of driver involvement, should only be allowed once it has surpassed regular drivers. To be honest, we might even be there already, given how terrible human drivers are...

[–] GonzoVeritas@lemmy.world 24 points 1 year ago

Autopilot systems on airplanes make fewer claims about autonomous operation than Tesla. No pilot relies completely on autopilot functionality.

[–] amju_wolf@pawb.social 4 points 1 year ago (1 children)

Autopilot in aircraft is actually kinda comparable; it still needs a skilled human operator to set it up and monitor it (and the other flight controls) all of the time. And in most modes it's not even really all that autonomous: at most it follows a pre-programmed route.

[–] 4am@lemmy.world 4 points 1 year ago (2 children)

Can’t the newer ones take off and land as well?

[–] Bene7rddso@feddit.de 2 points 1 year ago

Yes, but the pilot still needs to pay attention and be ready to intervene

[–] amju_wolf@pawb.social 2 points 1 year ago

They can, but the setup is still non-trivial, and full auto-landing capability isn't used all that much even where technically available. It also isn't just a capability of the aircraft; it requires a shitton of supporting infrastructure on the ground (airport), and many airports don't support it.

The equivalent would be installing new intersections that broadcast the current signal state for each lane. That would help self-driving cars immensely (and eventually regular cars too, via assistive technologies that help drivers drive more safely), but it's simply not a thing yet.

[–] amanneedsamaid@sopuli.xyz 2 points 1 year ago

I disagree. I feel that no matter how good the technology becomes, the odd one-in-a-million glitch that kills someone is not preferable to me over the accidents caused by humans (even if we assume the self-driving cars crash at a lower rate than human drivers).

The less augmentation past lane assist and automated braking, the better, IMO. I definitely disagree with a capped speed limit built into the vehicle; speed should never be limited below whatever could melt engine components or the like (and even that limit should take time to kick in). The detriment such a system would cause when it malfunctions far outweighs the benefits it would bring to safety.

[–] dudewitbow@lemmy.ml 2 points 1 year ago

It's why I'm all for automated trucking. Truck drivers are a dwindling resource, and the lifestyle of a cross-country truck driver isn't a highly sought-after job. The self-driving system should do the long trip from hub to hub, and each hub should handle the last few miles. That keeps drivers local and fixes a problem that is only going to get worse.

[–] dan1101@lemmy.world 2 points 1 year ago (2 children)

Long stretches of highway are good unless there is a stopped emergency vehicle.

[–] amju_wolf@pawb.social 5 points 1 year ago

I mean that's a huge issue for human drivers too.

We need assistive technologies that protect us, but if at any point the driver is no longer driving, the car manufacturer needs to take full responsibility.

[–] DauntingFlamingo@lemmy.ml 2 points 1 year ago* (last edited 1 year ago)

That would be the augmented part and the AI. ANYTHING that presents a potential hazard already takes a vehicle out of automated driving in most models, because after a few Teslas didn't stop, people started suing.