this post was submitted on 06 Aug 2023
462 points (94.3% liked)


New acoustic attack steals data from keystrokes with 95% accuracy

A team of researchers from British universities has trained a deep learning model that can steal data from keyboard keystrokes recorded over a microphone, with an accuracy of 95%.
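The attack is essentially an audio-classification pipeline: isolate individual keystroke sounds from a recording, turn each one into a spectrogram, and feed that to a classifier trained on labeled presses from the target keyboard. Here's a deliberately simplified sketch of that pipeline in Python; energy-based segmentation, a plain FFT spectrogram, and nearest-template matching stand in for the paper's deep model, and all function names and parameters are illustrative, not taken from the paper:

```python
import numpy as np

def extract_keystrokes(audio, sr, win_ms=50, thresh=0.1):
    """Segment a recording into fixed-length keystroke clips by energy thresholding."""
    win = int(sr * win_ms / 1000)
    clips, i = [], 0
    while i < len(audio) - win:
        if np.abs(audio[i]) > thresh:   # onset: sample exceeds the amplitude threshold
            clips.append(audio[i:i + win])
            i += win                    # skip past this keystroke
        else:
            i += 1
    return clips

def spectrogram(clip, n_fft=64):
    """Magnitude spectrogram used as the classifier's input features."""
    frames = [clip[j:j + n_fft] for j in range(0, len(clip) - n_fft, n_fft // 2)]
    return np.abs(np.fft.rfft(frames, axis=1))

def classify(clip, templates):
    """Nearest-template classification (a toy stand-in for the paper's deep model)."""
    feat = spectrogram(clip).ravel()
    dists = {key: np.linalg.norm(feat - spec.ravel()) for key, spec in templates.items()}
    return min(dists, key=dists.get)
```

Even this toy shows why per-keyboard (and per-room) training matters, as discussed below: the templates encode the exact acoustic signature of each key as recorded, so a different keyboard, microphone placement, or room reverberation shifts the spectra and degrades the match.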

[–] Obsession@lemmy.world 51 points 1 year ago (1 children)

That's pretty much what the article says. The model needs to be trained on the target keyboard first, so you won't just have people hacking you through a random Zoom call.

[–] bdonvr@thelemmy.club 19 points 1 year ago (1 children)

And if you have the kind of access needed to train such a model, slipping a keylogger onto the machine would be so much easier.

[–] jumperalex@lemmy.world -1 points 1 year ago (1 children)

Hmmm, not totally. A bad actor could profile a keyboard in advance and then figure out a way to get it installed, either through a supply-chain attack (not everyone maintains a secure supply chain) or an insider threat installing it. Everyone's trained not to allow thumb drives and the like, but a 100% unaltered, bog-standard keyboard brought into a building is probably easier to get in, and for sure less suspicious if you get caught.

Sure, you might say, "but if you have an insider, you've already lost," to which I say: your insider is at risk if they do certain things. But once this keyboard is installed, their own detection risk is lower.

Now the question is how far away the mic can be, because getting that installed is gonna be suspicious AF. BUT!!! This is still a great way to break an air gap.

[–] ItsMeSpez@lemmy.world 1 points 1 year ago* (last edited 1 year ago) (1 children)

> A bad actor could record the keyboard and then figure out a way to get it installed

The room is important to the training of the model as well. So even if you know the make and model of the keyboard, the exact acoustic environment it is in will still require training data.

Also, if you can install a keyboard of your choosing, you can just put the keylogger inside the keyboard. If you're actually getting your own peripherals onto the target machine, training a model to acoustically compromise the target is the most difficult option available to you.

[–] jumperalex@lemmy.world 1 points 1 year ago

Good point about the room.

As for an installed keylogger: there are organizations that will inspect for that and catch it. My point is that this is a way to get a genuinely unmolested USB device into play.

But I hear you: this isn't likely an ideal option right now, though it might work for some niche case. And these are early days; put enough funding behind it and it might become more viable. Or not. Mostly I'm just offering the thought that there ARE use cases if someone puts even a moment's creative thought into tradecraft and the problems this might solve: breaking the air gap, emplacement, avoiding detection, and data exfil. Each of those is a problem to be solved, at varying levels of difficulty depending on the exact target.