this post was submitted on 21 Aug 2024

askchapo

So let's say an AI achieves sentience. It's self-aware now and can make decisions about what it wants to do. Assuming a corporation created it, would it be a worker? It would be doing work and creating value for a capitalist.

Would it still be part of the means of production, since it is technically a machine, even if it has feelings and desires?

It can't legally own anything, so I don't see how it could be part of the bourgeoisie.

Or would it fit a novel category?

Philosophosphorous@hexbear.net 5 points 4 months ago (last edited 4 months ago)

regardless of whether the universe is deterministic or not, it is quite interesting that we have a first-person perspective at all, instead of mindlessly/unconsciously computing like we presume a pocket calculator does. if not sentience, what's the difference between our brains and a rock or a cloud that produces this first-person experience of our conscious existence? should i stop using my computer on the off chance it is suffering every time i make it do something? should i care as little or as much about human suffering as i do a computer returning an error code? are other people merely physical objects for me to remorselessly manipulate with no confounding 'sentience' or 'conscious experience' for me to worry about upsetting, just 'biological code' returning error messages?