this post was submitted on 14 Aug 2024
-62 points (17.0% liked)

Technology

60081 readers
3334 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related content.
  3. Be excellent to each another!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below, to ask if your bot can be added please contact us.
  9. Check for duplicates before posting, duplicates may be removed

Approved Bots


founded 2 years ago
MODERATORS
[–] UraniumBlazer@lemm.ee -1 points 4 months ago (1 children)

> A conscious system has to have some baseline level of intelligence that's multiple orders of magnitude higher than LLMs have.

Does it? By that definition, dogs aren't conscious. Apes aren't conscious. Would you say they both aren't self-aware?

> If you're entertained by an idiot "persuading" something less than an idiot, whatever. Go for it.

Why the toxicity? You might disagree with him, sure, but why go further and berate him?

[–] conciselyverbose@sh.itjust.works 6 points 4 months ago (1 children)

No, that definition does not exclude dogs or apes. Both are significantly more intelligent than an LLM.

Pseudo-intellectual bullshit like this, spread as if it adds to the discussion, does meaningful harm. It's inherently malignant, and it deserves the same contempt as flat-earth claims and fake medicine.

[–] UraniumBlazer@lemm.ee -3 points 4 months ago (1 children)

> No, that definition does not exclude dogs or apes. Both are significantly more intelligent than an LLM.

Again, it depends on what type of intelligence we're talking about. Dogs can't write code. Apes can't write code. LLMs can (and in my experience, not bad code for low-level tasks). Dogs can't summarize pages and pages of text; heck, they can't even have a vocabulary greater than a few thousand words. All of this definitely puts LLMs above dogs and apes on the scale of intelligence.

> Pseudo-intellectual bullshit like this being spread as adding to the discussion does meaningful harm. It's inherently malignant, and deserves to be treated with the same contempt as flat earth and fake medicine should be.

Your comments are incredibly reminiscent of self-righteous Redditors. You make bold claims without providing any supporting explanation. Could you explain how any of this is pseudoscience? How does any of this not follow the scientific method? How is it malignant?

[–] conciselyverbose@sh.itjust.works 4 points 4 months ago* (last edited 4 months ago)

Spitting out sequences of characters shaped like code, which may or may not arbitrarily work, isn't "intelligence" when you're a character generator that does nothing but imitate the patterns of other, similar characters. Language skills are not a prerequisite for intelligence, and calling what LLMs do "language skills" is already absurdly generous. They "know" what sentences look like. They can't reason about language. They can't solve linguistic puzzles unless the exact answers are already in their dataset. They're parrots (except parrots actually do have some intelligence beyond blindly mimicking word sounds).

There is no more need for a deep explanation with someone who very clearly doesn't know the basics than there is to explain a round Earth to a flat-earther. Pretending that a "discussion" between a moron trying to reason with a random word generator and the random word generator itself is useful is the equivalent of telling me how well the potentization worked on your homeopathic remedy. It's a giant flare that there is no room for substance.