this post was submitted on 10 Feb 2024
171 points (96.2% liked)
Technology
It used to be that you'd search for something, click through to the results, and load the ads on the page alongside the info.
Then Google started adding snippets with direct answers, and yes, there was an uproar from content sites about that. But some fraction of people still click through for more context.
With LLMs, all that traffic is 100% gone.
Eh, I might ask the LLM about something, but I always open its sources to verify its summaries. You still can't trust them fully.
That's just kicking the can down the road. They'll be exactly as trustworthy as your own brain at summarizing articles soon. What then?
I still want to know the source of what I'm being told. There are plenty of brains out there smarter than mine, and I'll still ask them for sources.
There is a reason why RAG and fine-tuning are big topics in the field. General foundation models are fine for general low-risk info, but if people really care, that's generally not enough.
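To make the RAG idea concrete: the pipeline retrieves relevant documents first and stuffs them into the prompt, so the answer is grounded in sources the user can check. This is only a toy sketch; the keyword-overlap retriever and the prompt format are stand-in assumptions for a real embedding store and model call.

```python
# Toy sketch of retrieval-augmented generation (RAG).
# The retriever below scores documents by simple word overlap with the
# query; a real system would use embeddings and a vector store instead.

def retrieve(query, documents, k=2):
    """Rank documents by word overlap with the query; return the top k."""
    q_words = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda doc: len(q_words & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(query, documents):
    """Assemble a prompt that grounds the answer in numbered sources."""
    context = "\n".join(f"[{i + 1}] {doc}" for i, doc in enumerate(documents))
    return (
        "Answer using only the numbered sources below, citing them inline.\n"
        f"{context}\n\nQuestion: {query}"
    )

docs = [
    "Lemmy is a federated link aggregator.",
    "RAG retrieves documents and feeds them to the model as context.",
    "Fine-tuning updates a model's weights on domain data.",
]
top = retrieve("how does RAG feed retrieved documents to the model", docs)
prompt = build_prompt("How does RAG work?", top)
```

The point of the numbered-source prompt is exactly the "I'll still ask for sources" workflow: the model is asked to cite, so its claims can be traced back and verified.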
Unfortunately, most people don't care. That's why most get their news from Facebook or TikTok, and only read headlines.
I mean, I don't care for most things; we don't all need to be experts on the topic of the week, imho.