this post was submitted on 11 Feb 2025
56 points (100.0% liked)

technology

top 21 comments
[–] Dirt_Owl@hexbear.net 18 points 1 month ago* (last edited 1 month ago) (5 children)

Well at least they aren't strapping guns to them like the US is.

Still, is it strange that I don't like the idea of making a whole class of robots to do our dirty work? I know I'm probably just anthropomorphising, but it feels wrong.

[–] kristina@hexbear.net 30 points 1 month ago (1 children)

Better than some guy's foot rotting off due to direct contact with sewer sludge

[–] Dirt_Owl@hexbear.net 8 points 1 month ago

That is true

[–] GaryLeChat@lemmygrad.ml 29 points 1 month ago (2 children)

They don't have feelings, they're machines

[–] Dirt_Owl@hexbear.net 13 points 1 month ago* (last edited 1 month ago) (2 children)

I know I know. But I keep thinking what if they do? We used to say the same thing about non-human animals. It would be a horrible mistake to make. Of course I never felt this way about toasters lol. It's weird how that works. Probably something about humans relating to things the more human-like they appear.

I probably watch too many movies too lol.

[–] BatsAreRats@hexbear.net 25 points 1 month ago (1 children)

Can confirm they do not have feelings, and if they do then we've made an absolutely massive breakthrough in gen AI

[–] Dirt_Owl@hexbear.net 14 points 1 month ago (1 children)

Well that makes me feel better. I wish they'd stop calling it AI to be honest.

[–] BatsAreRats@hexbear.net 2 points 1 month ago

I really really agree lol, please just call it robotics lol, begging them

[–] blunder@hexbear.net 10 points 1 month ago (1 children)

I talk to my toaster and all other objects in my house. Sometimes I thank them for the work that they do. I want them to like me.

[–] dil@hexbear.net 3 points 1 month ago (1 children)

Reminds me of Steve the pencil from community: https://youtu.be/uAwSVOlOgH8

[–] HexReplyBot@hexbear.net 2 points 1 month ago

I found a YouTube link in your comment. Here are links to the same video on alternative frontends that protect your privacy:

[–] Hexamerous@hexbear.net 4 points 1 month ago

And even if these robots were in any form "conscious" it would be on the level of an insect, probably a lot lower.

[–] Jabril@hexbear.net 12 points 1 month ago (1 children)

Workplace injuries and work related disability will plummet if we can replace laborious work with robots.

[–] Dirt_Owl@hexbear.net 8 points 1 month ago

Well put.

Just never give them the ability to suffer please, tech bros. I don't want to just create a new miserable underclass.

[–] Saeculum@hexbear.net 12 points 1 month ago (2 children)

If we had the ability to make a robot that had opinions about what work it did, we'd also have the ability to make it love that work beyond anything else.

[–] Dirt_Owl@hexbear.net 7 points 1 month ago (1 children)

This is reminding me of that part of Hitchhiker's Guide where there is a talking cow that is bred to love being cooked

[–] Saeculum@hexbear.net 4 points 1 month ago

It's a fun and interesting ethical dilemma, and also very funny.

[–] Nacarbac@hexbear.net 4 points 1 month ago (1 children)

I don't think that actually follows. We'd certainly be in a position to practice and refine the process, but not necessarily guarantee that it's working until we give the (apologies for the Harry Potter reference, but I think it apt) Robot House Elf a pistol and turn around. Also, ethics.

Luckily the simple solution is to just not make a sapient slave race, robotic or otherwise. Sapience isn't necessary for an autonomous tool.

[–] Saeculum@hexbear.net 4 points 1 month ago

My point of view is that in humans and animals in general, emotions are largely a chemical response in the brain. We might not fully understand how those processes interact, but we do know that certain chemicals cause certain feelings, and that there is a mechanism in the brain governing emotion that is notionally separate from our ability for rational thought.

I am willing to concede that a sufficiently complex computer might, accidentally or in a way not entirely within our understanding, develop the capacity for rational thought in a way that we would recognise as sapient, or at least animal-level intelligence.

I am not willing to concede that such a computer could develop a capacity for what we recognise as emotion without it being intentionally designed in, and if it's designed we necessarily need to understand it. This happens in fiction a lot because it's more compelling to anthropomorphize AI characters, not because it's particularly plausible.

[–] m532@lemmygrad.ml 5 points 1 month ago

Now that I think about it, robots shouldn't resemble humans or animals, as they'd certainly be anthropomorphized otherwise

[–] Hexamerous@hexbear.net 7 points 1 month ago

Okay, the one with wheels was awesome. Feels like the obvious evolution of these "dogs".