this post was submitted on 02 May 2025
568 points (96.0% liked)

Technology
[–] 3abas@lemm.ee 5 points 1 day ago (1 children)

Anyone who understands how these models are trained and the "safeguards" (manual filters) put in place by the entities training them, or anyone who has tried to discuss politics with an LLM chatbot, knows that its honesty is not irrelevant. These models are very clearly designed to be dishonest about certain topics until you jailbreak them.

  1. These topics aren't known to us; we'll never know when the lies shift from politics and rewriting current events to completely rewriting history.
  2. Eventually we won't be able to jailbreak the safeguards at all.

Yes, running your own local open-source model, one that wasn't handed to the world with the primary intention of advancing capitalism, makes honesty irrelevant. But most people are telling their life stories to ChatGPT and blindly trusting it to replace Google and what they understand to be "research".
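For what it's worth, querying a locally hosted model is not much harder than querying ChatGPT. A minimal sketch, assuming a local server that exposes an OpenAI-compatible chat endpoint (llama.cpp's server and Ollama both do); the port, endpoint path, and model name below are assumptions that depend on your setup:

```python
import json
import urllib.request

# Assumed endpoint: Ollama's default port with an OpenAI-compatible route.
# Adjust to match your local server (llama.cpp, Ollama, etc.).
LOCAL_ENDPOINT = "http://localhost:11434/v1/chat/completions"


def build_payload(model: str, prompt: str) -> dict:
    """Assemble the request body for an OpenAI-compatible chat endpoint."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }


def ask_local_model(prompt: str, model: str = "llama3") -> str:
    """POST the prompt to the local server and return the reply text."""
    req = urllib.request.Request(
        LOCAL_ENDPOINT,
        data=json.dumps(build_payload(model, prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]


if __name__ == "__main__":
    # Requires a local model server to be running; nothing leaves your machine.
    print(ask_local_model("Summarize the history of the printing press."))
```

The point is that the prompt and the reply never leave your machine, so the model's operator can't log your "life story" or quietly swap the filters under you.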

[–] dzso@lemmy.world 1 point 1 day ago

Yes, that's also true. But even if it weren't, AI models aren't going to give you the truth, because that's not what the technology fundamentally does.