this post was submitted on 02 Feb 2025
230 points (97.1% liked)

United States | News & Politics

Welcome to !usa@midwest.social, where you can share and converse about the different things happening all over/about the United States.

all 49 comments
[–] Ascrod@midwest.social 2 points 1 day ago

Lol. Lmao. Let them try to enforce it.

[–] spujb@lemmy.cafe 8 points 1 day ago

so-called “free market capitalists” when someone made a better product:

[–] trevor@lemmy.blahaj.zone 13 points 2 days ago (2 children)

I had zero interest in downloading this shit before because LLMs are just lying slop machines, but if it becomes illegal, I will go out of my way to.

[–] BreadstickNinja@lemmy.world 2 points 1 day ago (3 children)

I'm with you on principle, but the thing is also like 600 GB of data. Not sure I have the disk space to take a stand on this one.

[–] LifeInMultipleChoice@lemmy.dbzer0.com 1 points 1 day ago (1 children)

There are no lite versions? I was trying to find a small LLM I could run on an old machine, take it off the internet (or just firewall it), and play around with to see if there's anything worth learning there for me. I was looking at the lite version of Llama, but when I tried to run the install on Mint I ran into some issues, then had too many drinks to focus on it, so I went back to something else. Maybe next weekend. If you have any recommendations, I'm all ears.

[–] BreadstickNinja@lemmy.world 1 points 1 day ago

There are finetunes of Llama, Qwen, etc., based on DeepSeek that implement the same pre-response thinking logic, but they are ultimately still the smaller models with some tuning. If you want to run locally and don't have tens of thousands to throw at datacenter-scale GPUs, those are your best options, but they differ from what you'd get in the DeepSeek app.
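For picking a distill that fits an older machine, a rough sizing sketch helps. The model list, the params × bits/8 rule of thumb, and the ~20% runtime overhead below are assumptions for illustration, not benchmarks:

```python
# Rough sizing sketch for common DeepSeek-R1 distill sizes (assumed list).
# Rule of thumb (an assumption, not a measurement): a quantized model needs
# roughly params * bits/8 bytes, plus ~20% overhead for KV cache and runtime.

DISTILL_SIZES_B = {"Qwen-1.5B": 1.5, "Llama-8B": 8, "Qwen-14B": 14,
                   "Qwen-32B": 32, "Llama-70B": 70}

def approx_ram_gb(params_billions: float, bits_per_weight: int = 4,
                  overhead: float = 1.2) -> float:
    """Approximate RAM/VRAM needed to load a quantized model, in GB."""
    return params_billions * (bits_per_weight / 8) * overhead

def fits(ram_gb: float) -> list[str]:
    """Which distills plausibly fit in a machine with `ram_gb` of memory."""
    return [name for name, b in DISTILL_SIZES_B.items()
            if approx_ram_gb(b) <= ram_gb]

print(fits(8.0))  # what an old 8 GB box could plausibly load at 4-bit
```

By this estimate an 8 GB machine is limited to the 1.5B and 8B distills at 4-bit; the 600+ GB figure above is the full undistilled model.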

[–] trevor@lemmy.blahaj.zone 2 points 1 day ago (1 children)

No worries. I've got some 20TB drives I can throw these on 😁

Now, running a model that large... Well, I'll just have to stick with the 8-13b params.

[–] pupbiru@aussie.zone 1 points 1 day ago

you can actually run the model, but it just goes very slowly… i run the 70b model on my m1 mbp, and it technically “requires” 128gb of VRAM - it still runs, just not super fast (though i’d say it’s usable in this case at about 1 word per ~300ms)
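That speed is in the ballpark of what a memory-bandwidth-bound estimate predicts: decoding reads roughly the whole quantized model once per token. A sketch, with the bandwidth and quantization numbers assumed for illustration:

```python
# Back-of-envelope check: local LLM decoding is usually memory-bandwidth-
# bound, so tokens/sec is roughly bandwidth divided by the bytes read per
# token (about the whole quantized model, for a dense model). The numbers
# below are assumptions for illustration, not measurements.

def tokens_per_sec(params_billions: float, bits_per_weight: float,
                   bandwidth_gb_s: float) -> float:
    model_gb = params_billions * bits_per_weight / 8  # quantized model size
    return bandwidth_gb_s / model_gb

# A 70B model at 4-bit on ~400 GB/s of unified memory bandwidth:
print(round(tokens_per_sec(70, 4, 400), 1))  # ≈ 11.4
```

The observed ~3 tokens/s (one word per ~300 ms) is consistent with lower effective bandwidth or the model spilling past unified memory.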

[–] Kolanaki@pawb.social 1 points 1 day ago

I'd have to remove all my games, the operating system, and run my SSDs in RAID to be able to fit that and still generate things with it. 😮

[–] hmmm@sh.itjust.works 5 points 1 day ago

Umm, what the actual fuck are Americans doing?

[–] Arcturus@lemmy.dbzer0.com 61 points 2 days ago (2 children)

They're gonna have a hard time making it illegal to download completely open source software lol

[–] Ensign_Crab@lemmy.world 26 points 2 days ago (2 children)

Do not tempt them to outlaw open source.

[–] SpaceNoodle@lemmy.world 37 points 2 days ago

Facebook already tipped their hand by prematurely banning posts about Linux.

[–] TaiCrunch@sh.itjust.works 2 points 1 day ago* (last edited 1 day ago)

The most they could do is try to arbitrarily make the licenses null and void, but there's no functional way to outlaw making code publicly available without also outlawing the entirety of HTML, CSS, and JavaScript.

[–] Emotional@lemmy.blahaj.zone 13 points 2 days ago (3 children)

Unfortunately, as I've learned recently, it doesn't look like Deepseek is actually open source.

You can download the model, but unless I'm misunderstanding, that feels comparable to calling Photoshop open source because you can download the .exe file on your computer.

[–] save_the_humans@leminal.space 13 points 2 days ago (1 children)

It's MIT licensed, meaning the code is open but the license is permissive in that copies can subsequently be closed. This is unlike the GPL, the license most commonly associated with open source code.

[–] frezik@midwest.social 11 points 2 days ago

The weights are MIT licensed. The code is, too, but the code for these things is uninteresting.

The training data is not open source, and that's the interesting part of a model.

You can reweight as you please to whatever dataset you like. They can say what the training data included, but they can't share the dataset.

[–] andMoonsValue@lemmy.world 4 points 2 days ago (1 children)
[–] Emotional@lemmy.blahaj.zone 5 points 2 days ago* (last edited 2 days ago) (1 children)

This comment here seems to summarize it well: https://github.com/deepseek-ai/DeepSeek-V3/issues/457#issuecomment-2627016777

It's more open-sourced than I thought, but also seems debatable. I don't know enough about LLMs to properly judge. I would probably stay away from calling it "completely open-sourced" though.

[–] andMoonsValue@lemmy.world 1 points 2 days ago (1 children)

Fair, but they gave quite a bit compared to other leaders in the space such as "open"ai. I'm excited about this project I stumbled across - https://github.com/huggingface/open-r1. There are gaps in what DeepSeek provided, but others are trying to fill them in.

[–] Emotional@lemmy.blahaj.zone 2 points 2 days ago

Of course! I think I've been particularly cynical about stuff being named open source because of OpenAI.

I use LLMs through Perplexity and GitHub CoPilot all the time, but I'm still too spiteful and petty to use anything from "Open"AI. I've been very happy with R1 so far.

[–] TheReturnOfPEB@reddthat.com 75 points 2 days ago* (last edited 2 days ago)

Oh I thought we just did Executive Orders these days. Interesting for Congress to want to be in the pocket of the three richest men in the United States, too.

[–] mkwt@lemmy.world 18 points 2 days ago (2 children)

These models are mostly giant tables of weights that run on standardized framework software, right?

We're talking about making illegal numbers again, aren't we?

09 F9 11 02 9D 74 E3 5B D8 41 56 C5 63 56 88 C0
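The point generalizes: any blob of weights is, byte for byte, just a (very large) number. A toy sketch turning the hex string above into one integer:

```python
# The AACS processing key printed above, treated as what it is: a number.
key_hex = "09 F9 11 02 9D 74 E3 5B D8 41 56 C5 63 56 88 C0"
key = int(key_hex.replace(" ", ""), 16)

# A model checkpoint is the same idea at scale: a file's bytes form one
# big integer, and banning the file amounts to banning that number.
print(key.bit_length())  # 124 — fits comfortably in 128 bits
```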

[–] shirro@aussie.zone 3 points 1 day ago

We're going to need a bigger shirt.

[–] threshold_dweller 3 points 1 day ago

It's funny what happened with AACS; they didn't learn.

DeCSS was the same before that. They sure taught us, eh? 🖤🏴‍☠️🖤

[–] Kazumara@discuss.tchncs.de 11 points 2 days ago (1 children)

Since I don't see the bill on the Congress website for some reason, here is what the dimwit senator published on his own site:

https://www.hawley.senate.gov/wp-content/uploads/2025/01/Hawley-Decoupling-Americas-Artificial-Intelligence-Capabilities-from-China-Act.pdf

[–] TaiCrunch@sh.itjust.works 4 points 1 day ago (1 children)

Of course it's Josh Hawley.

[–] VitoRobles 1 points 1 day ago* (last edited 1 day ago)

Ah yeah, Josh Hawley. I still think of him as the [fragile little snowflake](https://www.npr.org/2021/11/11/1054615028/is-masculinity-under-attack-sen-hawley-wants-to-defend-the-men-of-america) who got his feelings hurt because he thinks "masculinity" is under attack. Even wrote a book about it. Imagine thinking that deeply about men.

[–] zkfcfbzr@lemmy.world 55 points 2 days ago (1 children)

Well now I'm just gonna download it even harder.

[–] SVcross@lemmy.world 4 points 2 days ago

I will download and upload it everywhere I can, if this gains traction.

[–] Chozo@fedia.io 32 points 2 days ago (2 children)

Because prohibition works, right?

[–] AtHeartEngineer@lemmy.world 1 points 1 day ago

Prohibition of technology import is insane, how fucking moronic.

[–] Showroom7561@lemmy.ca 1 points 1 day ago

Nazis gotta try.

[–] RizzRustbolt@lemmy.world 25 points 2 days ago

Yes, that will definitely put the toothpaste back.

[–] Chivera@lemmy.world 21 points 2 days ago (1 children)

Is this the bill that will finally bring egg prices down?

No this one is just going to prop up nvidias value for a few more months.

[–] ShimmeringKoi@hexbear.net 20 points 2 days ago* (last edited 2 days ago)

In evil badcountry, the regime makes it a crime simply to possess things from Outside

[–] JeeBaiChow@lemmy.world 14 points 2 days ago

I guess competition isn't a thing in the us anymore. What a bunch of pussies.

[–] FuckyWucky@hexbear.net 12 points 2 days ago* (last edited 2 days ago)

DeepSeek did everything right; it will not be possible to ban it when anyone can run distilled versions, and even the full version, on their own mid-to-high-end consumer hardware.

The only way to access OpenAI o1 is through their API.

[–] happybadger@hexbear.net 11 points 2 days ago

It still makes mistakes with basic Physics 101 questions, but so far it's the only LLM I've used that gets them right ~80% of the time instead of ~30% with ChatGPT or Google.

[–] Rhaedas@fedia.io 10 points 2 days ago

Damn. Okay, everyone that's already grabbed a copy off of Ollama needs to upload it back.

[–] snekerpimp@lemmy.world 8 points 2 days ago (1 children)

“The downloading of deep seek is causing the prices of eggs to rise, so stop the downloading and the prices will fall!”

[–] Ascrod@midwest.social 2 points 1 day ago (1 children)
[–] snekerpimp@lemmy.world 2 points 1 day ago

“Hey kids, I’m a computer!”

[–] The_sleepy_woke_dialectic@hexbear.net 8 points 2 days ago* (last edited 2 days ago) (1 children)

Why doesn't "Open"AI just train gpt5 with deepseek, and reclaim their stock valuation?

[–] frezik@midwest.social 4 points 2 days ago

Because their premise is that they have an unassailable hardware advantage in the AI space. Deepseek just offered something on par with them while running on hardware that a mid-sized business could put together. OpenAI isn't special anymore.

[–] buh@hexbear.net 4 points 2 days ago

They didn't even give it a snappy acronym name, this country can't do anything right anymore 😒