[–] kibiz0r@midwest.social 7 points 17 hours ago* (last edited 13 hours ago) (2 children)

Could just say:

If you accept either privacy of consciousness or phenomenal transparency, then philosophical zombies must be conceivable; therefore physicalism is wrong, and you can’t engineer consciousness by mimicking brain states.

Edit:

I guess I should've expected this, but I'm glad to see multiple people wanted to dive deep into this!

I don't have the expertise or time to truly do it justice myself, so if you want to go deep on this topic, I'm going to recommend my favorite communicator on non-materialist (but also non-religious) takes on consciousness, Emerson Green:

But I'm trying to give a tl;dl to the first few replies at least.

[–] tabular@lemmy.world 3 points 16 hours ago (2 children)

Why does privacy of consciousness mean one can't engineer consciousness by mimicking states of the organ that probably has something to do with it?

What does phenomenal transparency mean?

[–] Hackworth@lemmy.world 3 points 15 hours ago (2 children)

Plus, privacy of consciousness may just be a technological hurdle.

[–] tabular@lemmy.world 1 points 4 hours ago

We may be able to tell with great confidence what you're thinking or feeling, but not how it feels to you. There's a subjective, first-person experience: something it is like to be you, which is different from what it is like to be me. I can't tell what it's like to be someone else, or to be another animal, or whether it means anything to be a rock.

[–] kibiz0r@midwest.social 2 points 13 hours ago

Privacy doesn't mean that nobody can tell what you're thinking. It means that you will always be more justified in believing yourself to be conscious than in believing others are conscious. There will always be an asymmetry there.

Replaying neural activity is impressive, but it doesn't prove the original recorded subject was conscious quite as robustly as my daily subjective experience proves my own consciousness to myself. For example, you could conceivably fabricate an entirely original neural recording of a person who never existed at all.

[–] kibiz0r@midwest.social 1 points 13 hours ago (1 children)

I added some episodes of Walden Pod to my comment, so check those out if you wanna go deeper, but I'll still give a tl;dl here.

Privacy of consciousness is simply the claim that there's a permanent asymmetry in how well you can know your own mind vs. the minds of others, no matter how sophisticated you get with physical tools. You will always have a different level of doubt about the sentience of others compared to your own sentience.

Phenomenal transparency is the idea that your internal experiences (like what pain feels like) are "transparent", where transparency means you can fully understand something's nature through cognition alone, without needing to measure anything in the physical world to complete your understanding. For example, the concept of a triangle, or the fact that 2+2=4, is transparent. Water is opaque, because you have to inspect it with material tools to understand the nature of what you're referring to.

You probably immediately have some questions or objections, and that's where I'll encourage you to check out those episodes. There's a good reason they're longer than 5 sentences.

[–] tabular@lemmy.world 2 points 10 hours ago* (last edited 10 hours ago) (1 children)

I thought that's what was meant by privacy of consciousness, and I agree that's how it is.

However, being unable to inspect whether something has consciousness doesn't mean we can't create a being which does. We would just be unaware whether we actually succeeded, or whether it even happened unintentionally while we were pursuing some other goal.

[–] kibiz0r@midwest.social 1 points 7 hours ago (1 children)

Gotcha. Yeah, I can endorse that viewpoint.

To me, “engineer” implies confidence in the specific result of what you’re making.

So like, you can produce an ambiguous image like The Dress by accident, but that’s not engineering it.

The researchers who made the Socks and Crocs images did engineer them.

[–] tabular@lemmy.world 2 points 5 hours ago

I see what you mean. By that definition of "engineer", I would agree.

We could perhaps engineer androids that mimic us so well that to damage them would feel to us like hurting a human. I would feel compelled to take the risk of caring for an unfeeling simulation just in case they were actually able to suffer or flourish.

[–] 10001110101@lemm.ee 2 points 15 hours ago* (last edited 15 hours ago) (1 children)

Lol. This comment sent me down a rabbit hole. I still don't know if it's logically correct from a non-physicalist POV, but I did come to the conclusion that I lean toward eliminative materialism and illusionism. Now I don't have to think about consciousness anymore because it's just a trick our brains play on us (consciousness always seemed poorly defined to me anyways).

I guess when AI appears to be sufficiently human or animal-like in its cognitive abilities and emotions, I'll start worrying about its suffering.

[–] kibiz0r@midwest.social 1 points 14 hours ago* (last edited 5 hours ago)

If you wanna continue down the rabbit hole, I added some good stuff to my original comment. But if you're leaning towards epiphenomenalism, might I recommend this one: https://audioboom.com/posts/8389860-71-against-epiphenomenalism

Edit: I thought of another couple of things for this comment.

You mentioned consciousness not being well-defined. It actually is, and the go-to definition comes from Nagel's 1974 essay “What Is It Like to Be a Bat?”

It’s a pretty easy read, as are all of the essays in his book Mortal Questions, so if you have a mild interest in this stuff you might enjoy that book.

Very Bad Wizards has at least one episode on it, too. (Link tbd)

Speaking of Very Bad Wizards, they have an episode about sex robots (link tbd) where (IIRC) they talk about the moral problems with having a convincing human replica that can’t actually consent, and that doesn’t even require bringing consciousness into the argument.