this post was submitted on 31 Jan 2025
340 points (94.7% liked)
Open Source
you are viewing a single comment's thread
This is obviously talking about their web app, which most people will be using. In this particular instance, it was clearly not the LLM itself censoring the Tiananmen Square topic, but a layer on top.
I have not bothered downloading DeepSeek and asking it about Tiananmen Square, so I cannot know what the model would have generated. However, it is possible that certain biases are trained into any model.
I am pretty sure this blog is aimed at the average user. While I wouldn't trust any LLM company with my data, I certainly wouldn't want the Chinese government to have it. Anyone who knows how to use [ollama](https://github.com/ollama/ollama) should know that the telemetry concerns don't apply when running the model locally. But for sure, pointing that out in the blog would help.
@ToxicWaste @JOMusic the censorship is trained into the ollama models too. But of course the self-hosted model cannot send anything to China, so at least the whole tracking issue is avoided.
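For anyone curious what "running locally" looks like in practice, here is a minimal sketch that queries ollama's local HTTP API from Python. It assumes ollama is already running on its default port (11434) and that a DeepSeek model tagged `deepseek-r1` has been pulled; the model tag and the prompt are illustrative, not taken from the thread.

```python
# Minimal sketch: query a locally hosted model through ollama's HTTP API.
# Assumptions (not from the thread): ollama is running on its default port
# 11434, and a DeepSeek model tagged "deepseek-r1" has already been pulled.
import requests

response = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "deepseek-r1",  # assumed local model tag
        "prompt": "What happened at Tiananmen Square in 1989?",
        "stream": False,         # return a single JSON object instead of a stream
    },
    timeout=120,
)
response.raise_for_status()

# The reply is generated entirely on the local machine; nothing is sent to a
# remote server, so only whatever is baked into the model weights applies.
print(response.json()["response"])
```

Since the request only goes to localhost, whatever the model answers (or refuses to answer) reflects what was trained into the weights, not a server-side filter or any tracking layer.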