I don’t know how there aren’t a myriad of problems associated with attempting to emulate the brain, especially with the end goal of destroying livelihoods and replacing one indentured servant with another. In fact, that’s what prompted this post- an advertisement for a talk with my alma mater’s philosophy department asking what happens when we see LLMs discover phenomenological awareness.
I admit that I don’t have a ton of formal experience with philosophy, but I took one course in college that will forever be etched into my brain. Essentially, my professor explained to us the concept of a neural network and how with more computing power, researchers hope to emulate the brain and establish a consciousness baseline with which to compare a human’s subjective experience.
This didn’t use to be the case, but in a particular sector, most people’s jobs are just showing up at work, getting on a computer, and having whatever (completely unregulated and resource-devouring) LLM give them answers they could find themselves, quicker. And shit like neuralink exists, and I think the next step will be to offer that with a chatgpt integration or some dystopian shit.
Call me crazy, but I don’t think humans are as special as we think we are, and our pure arrogance wouldn’t stop us from creating another self and causing that self to suffer. Hell, we collectively decided to slaughter en masse another collective group with feelings (animals) to appease our tastebuds, a lot of us are thoroughly entrenched in our digital boxes because opting out would mean losing things we take for granted, and any discussion of these topics is taboo.
Data-obsessed weirdos are a genuine threat to humanity; consciousness-emulation never should have been a conversation piece in the first place without first understanding its downstream implications. Feeling like a certified Luddite these days.
i think the through line from "sufficiently advanced computer + software" to "conscious brain akin to a human one" is basically made up nonsense conceived by SF writers and propagated as being actually real by tech bros. i don't think it's something worth taking seriously at all
Human consciousness is an emergent property of neurons firing in our brain. Unless you attribute consciousness to some external mystical force, replicating it should theoretically be possible. I'm not saying LLMs are the path to get there or that we are anywhere close to it, but it seems inevitable that it will eventually be achieved.
i personally do believe in the human soul and don't think rationalist vulgar materialism can fully explain consciousness, so yeah, I guess we may just fundamentally disagree there. it doesn't even have to be something "mystical" though, could just be something totally unknown to science that can never be replicated in silicon. even if you still think it's possible, it's plain that the current extinction event and the technological setbacks/energy crises it will bring are going to prevent much progress being made towards the currently science fiction-level technology and energy required to get even close. far from "inevitable" in my view and ultimately, a total waste of time and resources. may as well say Dyson spheres, another thing made up by SF writers, are inevitable. energy crises, tech setbacks and population destruction will always get in the way. it's utopian to a cartoonish extent, like hundreds or thousands of years of end stage communism would be needed for this kind of stuff to even begin being feasible. and if we had that, then I would hope creating AI slaves wouldn't be very high on the agenda. that's why I think taking it seriously is a waste of time.
Any thoughts on brain organoid computers related to this?