this post was submitted on 12 Aug 2024
140 points (97.9% liked)

Fuck AI


"We did it, Patrick! We made a technological breakthrough!"

A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.

Microsoft raced to put generative AI at the heart of its systems. Ask a question about an upcoming meeting and the company’s Copilot AI system can pull answers from your emails, Teams chats, and files—a potential productivity boon. But those same capabilities can also be abused by hackers.

At the Black Hat security conference in Las Vegas, researcher Michael Bargury is demonstrating five proof-of-concept ways that Copilot, which runs in Microsoft 365 apps such as Word, can be manipulated by attackers—including using it to provide false references to files, exfiltrate private data, and dodge Microsoft’s security protections.

One of the most alarming demonstrations is Bargury’s ability to turn the AI into an automatic spear-phishing machine. Dubbed LOLCopilot, the red-teaming code Bargury created can—once a hacker has access to someone’s work email—use Copilot to see who that person emails regularly, draft a message mimicking their writing style (including emoji use), and send a personalized blast that can include a malicious link or attached malware.

[–] N0body@lemmy.dbzer0.com 29 points 2 months ago (1 children)

The schadenfreude is still palpable with every “AI turns out to be billion-dollar snake oil” story, especially the extra spicy ones like this.

The wealthy people who run the world are sociopathic morons. Tell them you have a miracle way to fire all their human workers, and they will give you unlimited money and trust.

[–] LadyMeow@lemmy.blahaj.zone 13 points 2 months ago

Except it’s not really snake oil, it’s more like a dystopia machine