this post was submitted on 19 Feb 2025
110 points (99.1% liked)

top 19 comments
[–] _cryptagion@lemmy.dbzer0.com 2 points 2 days ago

What? But the tankies assured me this wouldn’t happen!

I am zero surprised.

[–] fxomt@lemmy.dbzer0.com 30 points 4 days ago (2 children)

No way... you're telling me a free AI is profiting off my data?

Always run AI locally!
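For anyone who wants to try: here's a minimal sketch of querying a locally served model via Ollama's HTTP API (this assumes Ollama is already running on its default port and you've pulled a model; the model name is just an example):

```python
# Minimal sketch: query a locally running Ollama server (default port 11434).
# Assumes you've already pulled a model, e.g. `ollama pull llama3.2`.
import json
import urllib.request

payload = json.dumps({
    "model": "llama3.2",  # example model name; use whatever you pulled
    "prompt": "Why run LLMs locally?",
    "stream": False,      # return one JSON object instead of a stream
}).encode()

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=payload,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```

Nothing leaves your machine; the prompt and the reply stay on localhost.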

[–] FarraigePlaisteach@lemmy.world 5 points 4 days ago (1 children)

Is that feasible for someone with an office PC with integrated graphics? Asking for a friend.

[–] BakedCatboy@lemmy.ml 4 points 4 days ago (1 children)

If you have a lot of RAM, you can run small models slowly on the CPU. Your integrated graphics, I'd guess, won't fit anything useful in its VRAM, so if you really want to run something locally, getting some extra sticks of RAM is probably your cheapest option.

I have 64 GB and I run 8–14B models. 32B is pushing it (it's just really slow).
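For anyone curious where those numbers come from, a rough back-of-envelope sketch (the ~4-bit quantization and 20% overhead figures are my assumptions, not hard numbers):

```python
# Back-of-envelope RAM estimate for a quantized LLM.
# Rule of thumb: bytes ≈ params * bits_per_weight / 8, plus ~20% overhead
# for KV cache and runtime buffers (the overhead factor is an assumption).
def model_ram_gb(params_billions: float, bits_per_weight: float = 4.5) -> float:
    weights_gb = params_billions * 1e9 * bits_per_weight / 8 / 1e9
    return weights_gb * 1.2  # assumed overhead for KV cache and buffers

for size in (8, 14, 32):
    print(f"{size}B model @ ~4-bit quant: ~{model_ram_gb(size):.1f} GB RAM")
# A 32B model at ~4-bit still fits in 64 GB, but CPU inference on it is
# slow, which matches the experience above.
```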

[–] 30p87@feddit.org 2 points 4 days ago (1 children)

Don't iGPUs use the RAM as VRAM directly? You'd only need to configure how much in the BIOS (e.g. by default it uses 1.5 GB of an 8 GB system, and you can set it to 6 of the 8 GB).

[–] BakedCatboy@lemmy.ml 2 points 4 days ago* (last edited 4 days ago)

Yes for gaming, but for LLMs I've heard that the bandwidth limitations of using system RAM as VRAM hurt performance more than just running on the CPU with system memory directly, since smaller models are more memory-bandwidth limited than compute limited.

I've never tried to run AI on an iGPU with system memory though, so you could try it, assuming it will let you allocate 32 GB or more (ideally 64 GB). I think you'll also need a special runner that supports iGPUs.
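To put very rough numbers on the bandwidth point, a sketch with assumed figures (the bandwidths below are illustrative examples, not measurements from anyone in this thread):

```python
# Token generation is roughly memory-bandwidth bound: each generated token
# streams (almost) the whole model through the memory bus once, so
# tokens/sec ≈ usable bandwidth / model size in bytes.
def tokens_per_sec(bandwidth_gb_s: float, model_size_gb: float) -> float:
    return bandwidth_gb_s / model_size_gb

MODEL_GB = 5.4  # e.g. an 8B model at ~4-bit quant (see estimate above)

# Illustrative, assumed bandwidth figures:
for name, bw in [("dual-channel DDR4", 50),
                 ("dual-channel DDR5", 80),
                 ("discrete GPU GDDR6", 450)]:
    print(f"{name:>20}: ~{tokens_per_sec(bw, MODEL_GB):.0f} tokens/sec ceiling")
# An iGPU shares the same system RAM, so it can't beat the CPU-side
# ceilings; that's why extra RAM plus CPU inference is the usual advice.
```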

[–] jonne@infosec.pub 2 points 4 days ago

Yeah, AI is even being trained on data provided by Nazi Steve Huffman's website.

[–] Anarki_@lemmy.blahaj.zone 25 points 4 days ago (1 children)
[–] asudox@lemmy.asudox.dev 3 points 4 days ago

when does it end

[–] ech@lemm.ee 11 points 4 days ago

What!? What a complete and utter shocker!!

Tbf, I don't use any of these corporate LLMs for exactly that reason. At best, they just use user interaction to "improve" the models, and more likely they're using it to profile and track users as well. Fuck that.

[–] Coldmoon@sh.itjust.works 8 points 4 days ago

Surprised pikachu face.

[–] ObsidianZed@lemmy.world 5 points 4 days ago (2 children)

Everyone had it wrong! It was the Chinese Government stealing your data to give to TikTok!

[–] LWD@lemm.ee 2 points 4 days ago

"I only care if America has my data"

DeepSeek > TikTok > Oracle > Ellison > America

[–] zombiewarrior@social.vivaldi.net 4 points 4 days ago (1 children)
[–] Captainautism@lemmy.dbzer0.com 3 points 4 days ago (1 children)
[–] gofsckyourself@lemmy.world 4 points 4 days ago (1 children)

Comments from some federated sources always add the username of the user they're replying to. It's one of the things I really hate about cross-federation.

Oooh, I didn’t know that. Thanks for explaining.