no, it fucking isn't. (see the postscript in linked article.)
mawhrin
“(…) perception, attention, thought, imagination, intelligence, comprehension, the formation of knowledge, memory and working memory, judgment and evaluation, reasoning and computation, problem-solving and decision-making (…)”
i guess it comes down to a philosophical question
no, it doesn't, and it's not a philosophical question.
the software simply has no cognitive capabilities.
LLMs know nothing. literally. they cannot.
oh my, you're such a confluence of bad takes (racist, transphobic, creepy, and ignorant of the technical and biological topics you're pontificating about).
preview:
reality:
“how to inflate your event by pretending there's more to it” (tbh i wouldn't mind if esr would inflict his pleasant personality on that crowd; and of course the renowned philosopher agnes callard will attend)
ack – thanks, it's good to have a proper reality check.
if one were paranoid, one would think they joined the community only to write that bloody post.
if i didn't know from @titotal@awful.systems that it's undoubtedly a manor house, i'd think it's a fucking castle.
(btw, did we ever hear back from the person after they decided to go full tescreal on their popular website?)
it's quite telling that you don't think that actors are “creatives” but think that “gpt-4 is a great drafter”.
well, dolstra's actions vindicated the open letter completely; also, i really hate the dismissive word “drama” for what is actually a serious issue of open source project governance and the acceptance of blood money.
neither is end-to-end encryption; the data is not private from the service provider.