this post was submitted on 19 Sep 2023
636 points (98.0% liked)

Europe

The police investigation remains open. The photo of one of the minors included a fly: the logo of Clothoff, the application presumably being used to create the images, which promotes its service with the slogan: “Undress anybody with our free service!”

[–] getoffthedrugsdude@lemmy.ml 2 points 1 year ago (2 children)

It won't, you'll just be able to verify a source

[–] taladar@feddit.de 9 points 1 year ago (2 children)

Not even that. It only allows you to verify that the source is identical to the (potentially wrong) information that was claimed at the time of recording by the person adding it to the blockchain. Blockchain, as usual, adds nothing here.

[–] devils_advocate@lemmy.ml 3 points 1 year ago

It proves that the video could not have been created at a later time.
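The point above is essentially trusted timestamping: if a file's hash was recorded at time T in an append-only log, any file matching that hash must have existed by T. A minimal Python sketch (the `ledger` dict, `record`, and `existed_by` are hypothetical stand-ins for whatever tamper-evident store is used, blockchain or otherwise):

```python
import hashlib
import time

def fingerprint(data: bytes) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    return hashlib.sha256(data).hexdigest()

# Stand-in for any append-only, tamper-evident record
# (a blockchain transaction, a notary log, a trusted database row).
ledger: dict[str, float] = {}

def record(data: bytes) -> None:
    """Store the file's fingerprint along with the current time."""
    ledger[fingerprint(data)] = time.time()

def existed_by(data: bytes, t: float) -> bool:
    """True if this exact content was recorded at or before time t."""
    recorded = ledger.get(fingerprint(data))
    return recorded is not None and recorded <= t

video = b"original footage"
record(video)
later = time.time() + 1.0

assert existed_by(video, later)                   # proven to predate `later`
assert not existed_by(b"forged footage", later)   # any altered bytes fail
```

Note what this does and does not prove: it bounds *when* the bytes existed, but says nothing about whether the footage itself is genuine.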

[–] fiah@discuss.tchncs.de -2 points 1 year ago (1 children)

Blockchain, as usual, adds nothing here.

it can add trust. If there's a trusted central authority where these hashes can be stored, then there's no need for a blockchain. However, if there isn't, then a blockchain could be used instead, as long as it's big and established enough that everybody can agree the data stored on it cannot be manipulated.

[–] nudnyekscentryk@szmer.info 9 points 1 year ago (1 children)

but false, nonconsensual nudes are not collectible items that need to have their authenticity proven. they are there to destroy people's lives. even if 99% of people seeing your nude believe you that it's not authentic, it still affects you heavily

[–] fiah@discuss.tchncs.de 5 points 1 year ago (1 children)

nonconsensual nudes are not collectible items that need to have their authenticity proven

of course not, but that's not what this comment thread is about. It's about this:

Ironically, in a sense we will revert back to the era before photography existed. To verify if something is real, we might have to rely on witness testimony.

that's where it can be very useful to store a fingerprint of a file in a trusted database, regardless of where that database gets its trust from

[–] nudnyekscentryk@szmer.info -2 points 1 year ago (1 children)

sure, but again: why would anyone want to do that with consensual or nonconsensual nudes?

[–] fiah@discuss.tchncs.de 3 points 1 year ago (1 children)

that is not what this comment thread is about

[–] nudnyekscentryk@szmer.info -1 points 1 year ago (1 children)

it very much is:

OP: In Spain, dozens of girls are reporting AI-generated nude photos of them being circulated at school: ‘My heart skipped a beat’

parent reply: Thats why we need Blockchain Technology

[–] fiah@discuss.tchncs.de 3 points 1 year ago* (last edited 1 year ago) (1 children)

a discussion can have multiple, separate threads with branching topics, that's what this threaded comment system is specifically made to facilitate

[–] nudnyekscentryk@szmer.info 0 points 1 year ago (1 children)

okay, let's rethread how we got here:

OP: Spanish girls report AI porn of them circulating

parent comment: Blockchain could fix this

1st-level reply: Blockchain can't counteract fake porn being created

2nd-level reply: it lets you verify the original source

3rd-level reply: if anything it lets you verify integrity between sources

you: if a central authority can't be trusted to verify sources then Blockchain can

me: it's not about verifying provenance of the material but rather its mere existence in the world

you: we can store the fingerprint of the file in a trusted database

me: but this doesn't affect the material's existence

you: you're going off-topic!

me: I am not

you: this conversation can have multiple threads

can you now see how it's you who's off the rails in this conversation? no one ever questioned how blockchain could allow verifying any piece of media's authenticity, but spreading forged, nonconsensual erotica is NOT about proving whether a photo or video in question is authentic; the problem is that people have got the tools to do it in the first place, and before a victim can counteract and prove (using blockchain, if you will) that a particular photo is a forgery, the damage is done regardless

[–] fiah@discuss.tchncs.de 0 points 1 year ago (1 children)

okay, let’s rethread how we got here:
OP: Spanish girls report AI porn of them circulating
parent comment: Blockchain could fix this

you're missing a step there, buddy. I know, it's hard, let me make it a bit easier for you by drawing a picture:

"blockchain can fix this" was never about preventing AI porn from being spread, it's about the general problem of knowing whether something was authentic, hence their choice to reply to that comment with that article

[–] nudnyekscentryk@szmer.info 1 points 1 year ago (2 children)

Again, for the sixth or whichever time: this has nothing to do with the crux of the problem

[–] papertowels@lemmy.one 2 points 1 year ago

....you're right, it has nothing to do with nudes, because it's talking about an entirely different problem of court-admissible evidence.

[–] fiah@discuss.tchncs.de 1 points 1 year ago (1 children)

yes, you're right, it doesn't, because we weren't talking about that. "blockchain" can't do anything to keep kids from having AI-generated naked pictures of them spread, and nobody here claimed otherwise

[–] nudnyekscentryk@szmer.info 2 points 1 year ago

yeah but the problem is the mere existence of tools allowing pornographic forgery, not verifying whether the material is real or not