this post was submitted on 25 May 2024
820 points (97.7% liked)

Technology

Google rolled out AI overviews across the United States this month, exposing its flagship product to the hallucinations of large language models.

[–] Emmie@lemm.ee 4 points 6 months ago* (last edited 6 months ago) (3 children)

People anthropomorphise LLMs way too much. I get it that at first glance they sound like a living being, human even, and it's exciting, but we've had some time now to learn it's just a very cool big-data processing algorithm.
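For what it's worth, the "big-data processing algorithm" framing can be made concrete: strip away the scale and an LLM is just a function that, given the text so far, produces a distribution over the next token, picks one, appends it, and loops. A toy bigram sketch of that same loop (obviously nothing like a real transformer, just the same shape):

```python
from collections import defaultdict

def train_bigram(corpus):
    """Count, for each word, which word follows it and how often."""
    counts = defaultdict(lambda: defaultdict(int))
    words = corpus.split()
    for prev, nxt in zip(words, words[1:]):
        counts[prev][nxt] += 1
    return counts

def generate(counts, start, length=5):
    """Greedily emit the most likely next word, over and over.
    This loop is the whole trick; an LLM swaps the lookup table
    for a neural network, but the generation loop looks the same."""
    out = [start]
    for _ in range(length):
        followers = counts.get(out[-1])
        if not followers:
            break  # never saw this word mid-sentence; stop
        out.append(max(followers, key=followers.get))
    return " ".join(out)

model = train_bigram("the cat sat on the mat the cat ran")
print(generate(model, "the"))  # -> "the cat sat on the cat"
```

There is no inner life in there to anthropomorphise; scaling the table up to a transformer changes the quality of the predictions, not the nature of the loop.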

It’s like boomers asking me what the computer is doing and referring to the computer as a person. It makes me wonder: will I be as confused as they are when I’m old?

[–] barsoap@lemm.ee 3 points 6 months ago (1 children)

Oh, hi, second coming of Edsger Dijkstra.

I think anthropomorphism is worst of all. I have now seen programs "trying to do things", "wanting to do things", "believing things to be true", "knowing things" etc. Don't be so naive as to believe that this use of language is harmless. It invites the programmer to identify himself with the execution of the program and almost forces upon him the use of operational semantics.

He may have thought like that when using language like that. You might think like that. The bulk of programmers don't. Also, I strongly object to the dissing of operational semantics. Really dig that handwriting, though; a well-rounded lecturer's hand.

[–] Emmie@lemm.ee 1 points 6 months ago* (last edited 6 months ago)

Oh, hi, second coming of Edsger Dijkstra.

Don’t say those things to me. I have special snowflake disorder. I literally got high reading this, seeing that a famous intelligent person has the same opinion as me. Great minds… god, see what you have done.

[–] FlyingSquid@lemmy.world 1 points 6 months ago

It's only going to get worse now that ChatGPT has a realistic-sounding voice with simulated emotions.

[–] OpenStars@discuss.online 1 points 6 months ago

Probably not about computers per se. The Greatest Generation knew a lot more about horses than the average person today does, and similarly we know more about the things that have mattered to us over the course of our lifetimes.

What would get weird for us is when we hit retirement age (ofc we can't ever retire, bc capitalism) and someone talks about the new horglesplort based on alien vibrations, computer-generated from the 11th dimension of string theory, and we're all like "wut!?"

fr fr no cap skibidi toilet rizz teabag

That said, humanity seems not only to have slowed the accretion of new knowledge but actually gone backwards: children today won't live as long as boomers did, and despite being on mobile devices all day long, most don't have the foggiest clue how computing works, as in programming or even binary. So we will likely be confused in the opposite way, as in "why can't you understand this?"