this post was submitted on 20 Apr 2025
232 points (93.6% liked)

[–] sem@lemmy.blahaj.zone 2 points 3 days ago* (last edited 3 days ago) (1 children)

To me that's so cringe, because I've tried it out for explaining concepts, and when I take that information and try to use it, it's confidently wrong so much of the time.

The one thing it has helped me with is system administration tasks, where traditional search engine results are old forum entries or out-of-date documentation. LLMs can suggest a way to do the task, and then I can follow those breadcrumbs and do real research on how to do what I need to do.

I was trying to think of use cases for it. Honestly, if you just want a general overview of a topic, the hallucinations don't affect it too much.