This post was submitted on 07 Aug 2025
83 points (100.0% liked)

Technology


The National Institute of Standards and Technology conducted a groundbreaking study on frontier models just before Donald Trump's second term as president—and never published the results.

top 3 comments
[–] sp3ctr4l@lemmy.dbzer0.com 12 points 4 days ago* (last edited 4 days ago)

Probably all you need to know is this: you know those industry conferences about AI and cybersecurity?

Yeah, they're not about how to use AI to improve security with neat, new heuristic detection methods, and automated response scenarios.

They are about all the extra work you have to do, and all the extra things you now need to be aware of and worried about, because AI so routinely introduces holes, exploits, and flaws ... in places you normally wouldn't think to check, because surely any person or team shipping code that terrible would have been fired, right?

Beyond the methods one can use to 'trick' AI into doing things it isn't 'supposed to do'... mass AI adoption across large swathes of the economy is, quite literally, a national security threat: it fundamentally compromises the security and integrity of the tech infrastructure that now undergirds basically everything.

[–] homesweethomeMrL@lemmy.world 5 points 4 days ago* (last edited 4 days ago)

“It became very difficult, even under [president Joe] Biden, to get any papers out,” says a source who was at NIST at the time. “It felt very like climate change research or cigarette research.”

Before taking office, President Donald Trump signaled that he planned to reverse Biden’s Executive Order on AI. Trump’s administration has since steered experts away from studying issues such as algorithmic bias or fairness in AI systems. The AI Action Plan released in July explicitly calls for NIST’s AI Risk Management Framework to be revised “to eliminate references to misinformation, Diversity, Equity, and Inclusion, and climate change.”

[–] wewbull@feddit.uk 5 points 4 days ago