UraniumBlazer

joined 2 years ago
[–] UraniumBlazer@lemm.ee -1 points 6 months ago (3 children)

A conscious system has to have some baseline level of intelligence that's multiple orders of magnitude higher than LLMs have.

Does it? By that definition, dogs aren't conscious. Apes aren't conscious. Would you say neither of them is self-aware?

If you're entertained by an idiot "persuading" something less than an idiot, whatever. Go for it.

Why the toxicity? You might disagree with him, sure, but why go further and berate him?

[–] UraniumBlazer@lemm.ee 4 points 6 months ago (1 children)

vocal chloroform.

Haha I'm stealing that

[–] UraniumBlazer@lemm.ee -5 points 6 months ago (11 children)

Exactly. Which is what makes this entire thing quite interesting.

Alex here (the interrogator in the video) is involved in AI safety research. Questions like "do the ethical frameworks of AI match those of humans" and "how do we get AI to not misinterpret inputs and do something dangerous" are very important to answer.

Following this comes the idea of consciousness. Can machine learning models feel pain? Can we unintentionally put such models into immense eternal pain? What even is the nature of pain?

Alex demonstrated that ChatGPT was lying intentionally. Can it lie intentionally about other things? What about the question of consciousness itself? Could we build models that intentionally fail the Turing test? Should we be scared of such a possibility?

Questions like these are really interesting. Unfortunately, they are shot down immediately on Lemmy, which is pretty disappointing.

[–] UraniumBlazer@lemm.ee 0 points 6 months ago

Who is to blame? The fascists: the IDF and Hamas, and those who fan the flames of this war.

[–] UraniumBlazer@lemm.ee 0 points 6 months ago* (last edited 6 months ago) (1 children)

Except for the fact that Israel IS committing genocide in Gaza.

I trust the International Court of Justice more than you.
