this post was submitted on 27 Jul 2023
46 points (100.0% liked)

Technology

[–] webghost0101@sopuli.xyz 25 points 1 year ago (2 children)

All good and well till someone takes a screenshot.

[–] lemann@lemmy.one 10 points 1 year ago (2 children)

It might be resistant to screenshots. Unless I missed it, the article didn't clarify whether the obfuscation process is applied to the image on a per-pixel basis or within the file format itself...

If it were that easy to bypass, it would be a pretty futile mechanism IMO: one would just need to convert the image to strip out the obfuscation 🫠 or take a screenshot, as you said
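To make the file-format case concrete, here's a toy sketch (not the actual scheme from the article; all names are hypothetical) of why a protection that lives only in the container would not survive a plain re-encode:

```python
# Toy model: an "image" is pixel data plus container-level metadata.
# If the protection lives only in the container, re-encoding just the
# pixels into a new file silently drops it.

def reencode(image):
    """Simulate 'convert to another format': keep pixels, drop container extras."""
    return {"pixels": list(image["pixels"]), "metadata": {}}

protected = {
    "pixels": [10, 20, 30, 40],
    "metadata": {"ai-edit-protection": "opaque-blob"},  # hypothetical container-level guard
}

converted = reencode(protected)
print("ai-edit-protection" in converted["metadata"])  # → False: guard stripped, pixels intact
```

A per-pixel perturbation, by contrast, would ride along with the pixel data through a format conversion, which is why the distinction the article leaves open actually matters.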

[–] SheeEttin@lemmy.world 18 points 1 year ago (1 children)

Sounds like it's tiny changes to the image data to trick it. But it also sounds dependent on each algorithm: while you might trick Stable Diffusion, another model like Midjourney would be unaffected.

And either way, I'd bet mere JPEG compression would be enough to destroy those tiny changes.

[–] esadatari@lemmy.world 3 points 1 year ago

A couple minutes in Photoshop with a smudge or burn tool would also negate all the effects

[–] diffuselight@lemmy.world 4 points 1 year ago (1 children)

These things never work in the real world. We've seen this over and over. It's snake oil. Latent-space mappings may survive compression, but they don't work across encoders.
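A hypothetical toy of that non-transfer claim (nothing like a real diffusion encoder; both "encoders" here are just made-up linear maps): a perturbation tuned against one encoder can shift its output strongly while barely moving a differently-weighted one.

```python
# Toy sketch: a perturbation aligned with encoder A's weights (its gradient
# direction) maximally moves A's output, but encoder B projects the input
# along a different direction, so B barely moves.

def encode(weights, x):
    return sum(w * xi for w, xi in zip(weights, x))

enc_a = [1.0, -1.0, 1.0, -1.0]   # hypothetical encoder A
enc_b = [1.0, 1.0, 1.0, 1.0]     # hypothetical encoder B

x = [0.5, 0.5, 0.5, 0.5]
eps = 0.1
delta = [eps * w for w in enc_a]          # perturbation crafted against A
x_adv = [xi + di for xi, di in zip(x, delta)]

shift_a = abs(encode(enc_a, x_adv) - encode(enc_a, x))
shift_b = abs(encode(enc_b, x_adv) - encode(enc_b, x))
print(shift_a, shift_b)  # A shifts by ~0.4; B barely moves at all
```

Real encoders aren't linear, but the same geometry applies: the perturbation is tailored to one model's internals and has no guarantee of transferring to another's.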

[–] Cybersteel@lemmy.ml 2 points 1 year ago

It's as good as scanning a random marking on a human bone that somehow installs a virus on your PC

[–] dan1101@lemmy.world 2 points 1 year ago

Yeah, it might work in the original format under some conditions, but it won't survive a screenshot or saving to another format.

Once again it's time to manually 'shop oneself shaking hands with celebrities.

[–] andruid@lemmy.ml 1 points 1 year ago

The linked white paper's title is very pragmatic-sounding: "Raising the Cost of Malicious AI-Powered Image Editing". I'd like to read it more deeply later to see what the actual mechanisms deployed are. I've considered some form of attestation embedded both in the data and in a form linked with a cryptographic signature. You know, for important things like politics, diplomacy, and celebrity endorsements. /s