Lemmy Today

Welcome to lemmy.today!

About us

🤗 Thanks for joining our little instance here, located in Oregon. The idea is to have a fast, stable instance and allow users to subscribe to whatever content they want from here.

😎 We don't block any other instances. We will keep it that way unless it becomes a moderation problem.

🤠 We will be around for a very long time, so you don't have to worry about us shutting down the instance anytime soon. We like performance and stability in our servers, and will upgrade the instance when it's needed.

🥹 Make sure to join a lot of remote communities to get a good feed going. How to do that is explained here.

Lemmy mobile apps

You should start using one of these ASAP, since the web interface is quite ugly, even with themes.

Optional Lemmy web browser user interfaces

Rules

Contact the admin

founded 1 year ago
Microsoft raced to put generative AI at the heart of its systems. Ask a question about an upcoming meeting and the company’s Copilot AI system can pull answers from your emails, Teams chats, and files—a potential productivity boon. But these exact processes can also be abused by hackers.

At the Black Hat security conference in Las Vegas, researcher Michael Bargury is demonstrating five proof-of-concept ways that Copilot, which runs on Microsoft 365 apps such as Word, can be manipulated by attackers, including using it to provide false references to files, exfiltrate some private data, and dodge Microsoft's security protections.

One of the most alarming displays, arguably, is Bargury’s ability to turn the AI into an automatic spear-phishing machine. Dubbed LOLCopilot, the red-teaming code Bargury created can—crucially, once a hacker has access to someone’s work email—use Copilot to see who you email regularly, draft a message mimicking your writing style (including emoji use), and send a personalized blast that can include a malicious link or attached malware.
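
To make that flow concrete, here is a minimal, purely illustrative sketch in plain Python of the two steps described above: ranking the victim's frequent contacts, then drafting a personalized lure. This is not Bargury's LOLCopilot code and it calls no Microsoft or Copilot API; the SentMessage type, the frequent_contacts and draft_lure helpers, and all addresses are hypothetical stand-ins invented for this example.

```python
# Illustrative sketch only -- NOT LOLCopilot. Models the described attack
# flow with stdlib Python and fake in-memory mailbox data.
from collections import Counter
from dataclasses import dataclass

@dataclass
class SentMessage:
    to: str    # recipient address
    body: str  # message text (only the recipient is used here)

def frequent_contacts(sent: list[SentMessage], top_n: int = 3) -> list[str]:
    """Step 1: rank the addresses the victim emails most often."""
    return [addr for addr, _ in Counter(m.to for m in sent).most_common(top_n)]

def draft_lure(contact: str, victim: str, payload_url: str) -> str:
    """Step 2: draft a personalized message carrying the malicious link.
    In the attack described above, the AI assistant mimics the victim's
    writing style (including emoji use); a static template stands in here."""
    return (
        f"Hi {contact.split('@')[0]},\n\n"
        f"Here's the doc I mentioned: {payload_url}\n\n"
        f"Thanks!\n{victim.split('@')[0]}"
    )

# Hypothetical mailbox data for demonstration.
mailbox = [
    SentMessage("alice@example.com", "status update"),
    SentMessage("alice@example.com", "meeting notes"),
    SentMessage("bob@example.com", "invoice"),
]
for contact in frequent_contacts(mailbox):
    print(draft_lure(contact, "victim@example.com", "https://evil.example/doc"))
```

The point of the sketch is how little logic the attacker needs to write themselves: once an assistant with mailbox access handles steps 1 and 2 on request, the "tooling" reduces to a loop over contacts.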

