Effectively, the government becomes the sole purveyor of truth
Straight out of a dystopian novel. No thanks
They already tried this sort of nonsense with the encryption controls in the 90s.
No thanks.
How did that end?
Strong encryption became generally available worldwide. Attempts to export-control it didn't work.
You want someone like Trump to decide who has access? Nah
Only the government and a few permitted parties
So a government and anyone who can pay a government's fee. This isn't really fixing the problem, just putting an extra barrier in the way of any smaller org that wants to get involved.
Never mind the fact that there isn't a government that can be trusted. Do you think the world is going to be improved by making perception-manipulating tech the private weapon of whatever bunch of psychopaths happens to rule at the time?
People would just buy gaming GPUs for "gaming". Then, whoops, they end up working on AI. Just like they're currently doing in China.
Ignoring the fact that there are multiple governments in the world, how could you even detect whether something was made with AI? An artist who touches up their art with Stable Diffusion would probably never be "caught". The way Stable Diffusion blends and alters images in Krita isn't terribly different from the rest of the Krita toolset, only faster and easier to control.
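For anyone who hasn't seen this workflow: the "touch-up" pass is just a low-strength img2img call. Here's a minimal sketch using the open-source diffusers library; the checkpoint name, prompt, and strength value are illustrative assumptions, not any particular artist's setup.

```python
# Minimal img2img "touch-up" sketch using diffusers.
# Checkpoint, prompt, and strength are illustrative assumptions.
import torch
from PIL import Image
from diffusers import StableDiffusionImg2ImgPipeline

pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # assumed SD 1.x checkpoint
    torch_dtype=torch.float16,
).to("cuda")

original = Image.open("painting.png").convert("RGB")  # the artist's own work

# Low strength preserves most of the original pixels and only refines detail,
# which is exactly why the output is so hard to flag as "AI generated".
result = pipe(
    prompt="clean up brush strokes, keep the original composition",
    image=original,
    strength=0.2,
    guidance_scale=7.5,
).images[0]

result.save("painting_touched_up.png")
```

At strength 0.2 the output is overwhelmingly the artist's own pixels, so any "was this made with AI?" test is chasing a matter of degree, not a yes/no property.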
Based on your replies, it doesn't seem like you want a discussion of the idea; it seems like you want people to tell you how good an idea it is.
Truth is, it's not. It's a half-thought-out idea that can't work. ASICs aren't so exotic that only a handful of chip manufacturers have the ability to make them. Existing companies can move quickly because they already have the infrastructure and processes in place, but other chip manufacturers can enter the space.
This assumes there is no black market or secondary market for ASICs.
This assumes that one government's restrictions would be effective when chip companies exist in more countries than just the one imposing them.
Restricting hardware also implies that today's hardware (or the ASICs of tomorrow) will remain the technology AI runs on. It also hampers R&D on this type of hardware.
It creates a barrier to entry for startups and smaller businesses that may use generative AI in positive ways.
It implies that the use of generative AI is inherently dangerous and needs to be regulated.
It assumes that consumer hardware wouldn't be able to match ASICs. ASICs are certainly fast, but enough consumer GPUs can match the processing power of a single ASIC (see the back-of-envelope sketch after this list).
It assumes the government is good, truthful, effective, honest, and moral.
It assumes that truth is a black and white construct.
It assumes that there will be a process to check, identify, communicate, and regulate AI generated information.
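To put a rough number on the consumer-GPU point above: the throughput figures below are placeholder assumptions, not vendor specs. The only takeaway is that consumer cards aggregate toward whatever a dedicated accelerator delivers, so restricting ASICs alone doesn't close the gap.

```python
# Back-of-envelope comparison: how many consumer GPUs roughly equal one
# dedicated AI accelerator. Both throughput numbers are placeholder
# assumptions for illustration only, not vendor specifications.
CONSUMER_GPU_TFLOPS = 80.0       # assumed per-card training throughput
DEDICATED_ASIC_TFLOPS = 1000.0   # assumed per-accelerator throughput

gpus_needed = DEDICATED_ASIC_TFLOPS / CONSUMER_GPU_TFLOPS
print(f"~{gpus_needed:.0f} consumer GPUs per accelerator "
      f"(ignoring interconnect and scaling overhead)")
```

Real-world scaling is worse than this raw-FLOPS ratio because of interconnect and memory bottlenecks, but the point stands: a stack of gaming cards gets you into the same ballpark.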
How would you feel about a law that restricts the ability to purchase hardware used for training AI?
No
Effectively, the government becomes the sole purveyor of truth
Extra no
No limiting consumer access to computer hardware.
Just no.
We still haven't recovered from the early crypto crap with GPUs.
Fix the environmental rules for corpos so they can't just stand up data farms and simultaneously wreak havoc on the grid and the environment without paying the full cost to offset the damage they're doing.