this post was submitted on 15 Oct 2024
1019 points (99.5% liked)

Wayback Machine back in read-only mode after DDoS, may need further maintenance.

[–] WaterSword@discuss.tchncs.de 10 points 1 month ago (1 children)

The thing is, sometimes articles must be removed from the IA (for copyright (I disagree with that one) or when leaked information could threaten lives); with a blockchain this would be impossible

[–] tehmics@lemmy.world 4 points 1 month ago (2 children)

this would be impossible

Perfect.

I'd be interested in seeing real examples where lives are threatened. I find it unlikely that the Internet Archive would be the exclusive arbiter of so-called deadly information.

[–] WaterSword@discuss.tchncs.de 9 points 1 month ago (1 children)

There was an actual example: a journalistic article about Afghanistan accidentally leaked the names of some sources and people who had helped Westerners in Afghanistan, which did actually endanger those people's lives.

[–] tehmics@lemmy.world 2 points 1 month ago (1 children)

If they're leaked, they're leaked. The archive doesn't change that one way or the other.

[–] douglasg14b@lemmy.world 0 points 1 month ago* (last edited 1 month ago) (1 children)

Gotcha, so you actually asked your previous question in bad faith, since you had no interest in the answer to begin with.

[–] tehmics@lemmy.world 1 points 1 month ago* (last edited 1 month ago)

No. The archive of it isn't doing the dangerous part. The info was already out there and the bad actor who would do something malicious would get that info from the same place the archive did. I need you to show how the archival of information that was already released leads to a dangerous situation that didn't already exist.

[–] brbposting@sh.itjust.works 3 points 1 month ago (1 children)

I thought of something but I don’t know if it’s a good example.

Here’s the hypothetical:

A criminal backs up a CSAM archive. Maybe the criminal is caught, heck say they’re executed. Pedos can now share the archive forever over encrypted messengers without fear of it being deleted? Not ideal.

[–] tehmics@lemmy.world 2 points 1 month ago* (last edited 1 month ago)

Yeah this is a hard one to navigate and it's the only thing I've ever found that challenges my philosophy on the freedom of information.

The archive itself isn't causing the abuse. CSAM is a record of abuse, and we restrict its distribution not because distribution or possession is inherently abusive, but because its creation was, and we don't want to support an incentive structure for the creation of more abuse.

i.e. we don't want more pedos abusing more kids with the intention of archival/distribution. So the archive itself isn't the abuse, but the incentive to archive could be.

There are also a lot of questions about the ethics of CSAM in general that I think we aren't ready to confront. It's a hard topic all around, and nobody wants to seriously address it beyond virtue signalling about how bad it is.

I could potentially see a scenario where the archival could be beneficial to society, similar to the FBI hash libraries Apple uses to scan iCloud for CSAM. If we throw genAI at this stuff to learn about it, we may be able to identify locations, abusers, and victims, track them down, and save people. But it would necessitate the existence of the data to train on.
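The hash-library approach mentioned above works by comparing fingerprints of files against a database of known material, so the scanner never needs the original content itself. Here's a minimal Python sketch of exact-hash matching; the hash set and function names are illustrative, and real systems (e.g. PhotoDNA-style databases) use far larger sets and typically perceptual hashes that survive resizing and re-encoding, rather than exact digests.

```python
import hashlib

# Hypothetical known-match list for illustration only. The entry below is
# simply the SHA-256 digest of the bytes b"foo".
KNOWN_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def sha256_hex(data: bytes) -> str:
    """Return the hex SHA-256 digest of a byte string."""
    return hashlib.sha256(data).hexdigest()

def is_known(data: bytes) -> bool:
    """Exact-match check against the known-hash set.

    Note: an exact digest changes completely if even one byte of the file
    changes, which is why production scanners use perceptual hashing instead.
    """
    return sha256_hex(data) in KNOWN_HASHES
```

The key property is that only digests need to be stored and distributed, not the underlying files, which is what makes this kind of scanning deployable at all.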

I could also see potential for using CSAM itself in psychotherapy. Imagine a sci-fi future where pedos are effectively cured by using AI trained on CSAM to expose them to increasingly mature imagery, allowing their attraction to mature with it. We won't really know if something like that is possible if we delete everything. It seems awfully short-sighted to me to delete data, no matter how perverse, because it could have legitimate positive applications that we haven't conceived of yet. So to that end, I do hope some three-letter agencies maintain their restricted archives of data for future applications that could benefit humanity.

All said, I absolutely agree that the potential to create incentives for abusers to abuse is a major issue with immutable archival, and it's definitely something we need to figure out before such an archive actually exists. So thank you for the thought experiment.