Why? Why wouldn't they? The way an animal experiences pain isn't magically different from the way an artificial construct could, just because the animal's neurons and synapses are natural instead of artificial. A pain response is a negative feeling that exists to make a creature avoid behaviours detrimental to its survival. There's no real reason this shouldn't be reproducible artificially, or why the artificial version should be regarded as "less" than the natural one.
Not that I think LLMs are leading to meaningfully sentient AI, but that's a whole different topic.
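That functional definition is easy to instantiate in code, for what it's worth. A toy sketch (every name and number here is invented for illustration), where "pain" is just a penalty signal that suppresses whatever behaviour produced it:

import random

# "Pain" as a bare penalty signal: negative feedback that makes the
# agent avoid the behaviour that triggered it. All values arbitrary.
preference = {"step_on_hot_plate": 0.5, "walk_around_it": 0.5}

for _ in range(200):
    # Pick an action in proportion to current preference weights.
    action = random.choices(list(preference), weights=list(preference.values()))[0]
    pain = 1.0 if action == "step_on_hot_plate" else 0.0
    # The avoidance response: lower the tendency to repeat the painful act.
    preference[action] = max(0.01, preference[action] - 0.05 * pain)

print(preference)  # the harmful behaviour's weight decays toward 0.01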
B/c they're machines without pain receptors. It's kind of biology 101, but science has been totally erased in this "AI" grift.
A "pain receptor" is just a type of neuron. These are neural networks made up of artificial neurons.
"Neural network" is a misnomer. They have very little, if anything, to do with actual neurons.
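For anyone who hasn't seen one up close: an artificial "neuron" is just a weighted sum pushed through a squashing function. A sketch (inputs, weights, and bias are arbitrary numbers):

import math

def artificial_neuron(inputs, weights, bias):
    # A dot product plus a sigmoid; no membranes, neurotransmitters,
    # or spike timing anywhere in sight.
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 / (1 + math.exp(-z))

print(artificial_neuron([0.2, 0.7], [0.5, -1.3], 0.1))  # ~0.33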
This situation is like adding a face layer onto your graphics rendering in a game engine and setting it so that the face becomes pained when the fps drops and happy when the fps is high, then tracking whether that facial system improves fps as a test of whether your game engine is sentient.
It's a fancy calculator. It uses its neural network to do fancy math, just like a modern video game engine does. Making it output a text response related to pain is the same as adding the face to the HUD, except that the video game's face is tied to an actual quantity, whereas the LLM is just keeping the 'pain meter' in the input context it uses to calculate a text response.
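To make the analogy concrete (call_llm below is a hypothetical stand-in, not any real API):

def call_llm(prompt):
    raise NotImplementedError  # placeholder for a real model call

def face_for_fps(fps):
    # Game-engine version: the "emotion" is a lookup on a perf counter.
    return "pained" if fps < 30 else "happy"

def pained_reply(pain_meter):
    # LLM version: the "pain" is just a number serialized into the prompt,
    # which the model turns into pain-themed text like any other input.
    prompt = f"[pain level: {pain_meter}/100]\nDescribe how you feel."
    return call_llm(prompt)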
It’s never going to happen because we’re never going to make a program even close to actually resembling an animal brain. “AI” is a grift.
Plus, this is kind of oversimplifying it. You could do that with traditional programming and no neural network at all. Like, you could make a dog-training game/simulator and (you shouldn’t, but you could) add the ability to inflict “pain” to discourage the computer dog from unwanted behaviors. That fits your definition, but the dog is very clearly just a computer program, not “experiencing” anything. It could literally just be
def on_hit(dog):
    dog.pee_on_floor -= 1  # the "pain" is just a decrement
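Fleshed out slightly (names invented), the entire "pain system" is ordinary state mutation:

class SimDog:
    def __init__(self):
        self.pee_on_floor = 10  # arbitrary starting urge

    def on_hit(self):
        # The whole "experience of pain": an integer goes down.
        self.pee_on_floor = max(0, self.pee_on_floor - 1)

dog = SimDog()
dog.on_hit()
print(dog.pee_on_floor)  # 9; nothing anywhere felt anything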
I don't think we know enough about the brain to say that for certain. It could operate in ways fundamentally different from a computer.
I intuit that an artificial, digital consciousness is going to have a different material reality from our own[1], so its consciousness wouldn't depend on mimicking ours. It's like how organic molecules could use silicon as a base instead of carbon, yet our efforts in space center on finding "life as we know it" rather than these other types of life. To my mind, digital sentience wouldn't be subject to evolutionary pressures, so I'd sooner try to measure for creativity and curiosity. The question would be whether the entity is capable of being its own agent in society: able to make its own decisions and deal with the consequences.
[1] as opposed to that artificial jellyfish