[–] ArchRecord@lemm.ee 2 points 6 hours ago (1 children)

There's absolutely something to be said for trying to ensure that kids don't have access to porn, but that doesn't come from what these legal battles inevitably want to impose: ID check requirements that create a massive treasure trove of data for attackers to target, enabling stolen IDs, blackmail, and privacy violations, while adding costs for porn sites that inevitably lead to predatory monetization, such as more invasive ads.

The problem is that parents are offloading the responsibility for education that belongs with themselves and schools, and instead placing an unworkable burden onto the sites that host and distribute pornographic content.

We know that when you provide proper sex education and talk to kids about how to consume adult content safely, without risking their health and safety, and while setting realistic expectations, you tend to get much better outcomes.

If there's one thing I think most people are very aware of, it's that the more you try to hide something from kids, the more they tend to resist and find it anyway, except without any proper education or safeguards.

It's why abstinence-only education tends to lead to worse outcomes than sex education, even though, on the surface, you're "exposing" kids to sexually related materials.

This doesn't mean we should deliberately expose kids to porn out of nowhere, or remove all restrictions and age checks, but it does mean that we can, for example:

  • Implement reasonable sex education in schools. Kids who have sex ed generally engage in healthier masturbation and sex than kids who don't.
  • Have parents talk with their kids about safe and healthy sex & relationships. It's an awkward conversation, but we know it keeps kids healthier and safer in the long run.
  • Implement a captcha-like system to make it a little more difficult (and, primarily, slower and less stimulating) for kids to quickly access porn sites, for example by requiring somewhat higher-level math problems to be solved. This doesn't rely on giving up sensitive personal info (see the sketch after this list).
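
To make that last bullet concrete, here's a minimal sketch of what such a math-based friction gate could look like, assuming a simple linear-equation challenge; the function names and difficulty are hypothetical, not anything an actual site uses:

```python
import random

def make_challenge():
    """Generate a simple algebra problem (a*x + b = c) and its answer.
    The difficulty here is arbitrary; a real gate would tune it."""
    a = random.randint(2, 12)
    x = random.randint(2, 20)
    b = random.randint(1, 30)
    return f"Solve for x: {a}x + {b} = {a * x + b}", x

def gate_allows_access(get_answer) -> bool:
    """Pose the challenge and return True only on a correct answer.
    `get_answer` is any callable that takes the question text and
    returns the user's response (e.g. a web form handler)."""
    question, expected = make_challenge()
    try:
        return int(get_answer(question)) == expected
    except (ValueError, TypeError):
        return False

if __name__ == "__main__":
    # Command-line demo; a site would wire this into its request flow instead.
    print("Access granted" if gate_allows_access(input) else "Try again")
```

The point isn't that this is hard to defeat; it's that it adds friction and delay without collecting any personal data.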

Kids won't simply stop viewing porn if you implement age gates. Kids are smart; they find their way around restrictions all the time. If we can't reasonably stop them without producing a whole host of other extremely negative consequences, then the best thing we can do is educate them so they don't severely risk their own health.

It's not perfect, but it's better than creating massive pools of private data, perverse financial incentives, and pushing people to more fringe sites that do even less to comply with the law.

[–] venusaur@lemmy.world 2 points 5 hours ago (1 children)

I understand and agree with what you’re saying. I think people should need licenses to have kids, but that’s a different story.

The conflict that this often boils down to is that the digital world does not emulate the real world. If you want to buy porn in the real world, you need ID, but online anything goes. I love my online anonymity just as much as everybody else, but we’ll eventually need to find some hybrid approach.

We already scan our faces on our phones all the time, or scan a finger on our computers. How about when you want to access a porn site you have to type in a password or provide some biometric credential?

I think 50% or more of the resistance to restricting porn is really just that people love porn and are ashamed of what they view. There's a whole other social psychology that needs to change in how we view sex, and I agree with more education.

[–] ArchRecord@lemm.ee 3 points 2 hours ago

The conflict that this often boils down to is that the digital world does not emulate the real world. If you want to buy porn in the real world, you need ID, but online anything goes. I love my online anonymity just as much as everybody else, but we’ll eventually need to find some hybrid approach.

The problem is that because the internet is fundamentally different from the real world, it has its own challenges that make some of the things we do in the real world unfeasible online. Showing an ID to a clerk at a store doesn't transmit your sensitive information over the internet to or through an unknown list of companies that may or may not store it for an undetermined amount of time, but doing so online essentially has to.

While I do think we should try to prevent kids from viewing porn at young ages, a lot of the mechanisms proposed to do so are either not possible, cause enough other harms by their existence to outweigh their benefits, or are trivially bypassed.

We already scan our faces on our phones all the time, or scan a finger on our computers. How about when you want to access a porn site you have to type in a password or provide some biometric credential?

Those systems are fundamentally different, even though the interaction is the same, so implementing them in places like porn sites carries entirely different implications.

For example (and I'm oversimplifying a bit here for time's sake), a biometric scan on your phone just compares the scan it takes each time with a hash (a processed version) of the original biometric scan you provided during setup. If they match, the phone unlocks.

This verification process does nothing to verify that you're a given age, only that your face/fingerprint is the same as during setup. It also never has to transmit or store your biometrics with another company; everything stays on-device.
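
As a toy illustration of that simplified on-device model (and only the simplified model: real biometric unlock uses fuzzy template matching inside dedicated secure hardware, not an exact hash comparison), something like:

```python
import hashlib

def enroll(biometric_reading: bytes) -> str:
    """At setup, keep only a processed version (here, a hash) of the scan
    on the device. The raw reading is never stored or sent anywhere."""
    return hashlib.sha256(biometric_reading).hexdigest()

def unlock(stored_hash: str, new_reading: bytes) -> bool:
    """Each attempt re-processes the fresh scan and compares it locally
    against the enrolled value. Nothing leaves the device, and nothing
    here says anything about the user's age or identity."""
    return hashlib.sha256(new_reading).hexdigest() == stored_hash

if __name__ == "__main__":
    enrolled = enroll(b"example-fingerprint-template")
    print(unlock(enrolled, b"example-fingerprint-template"))  # True
    print(unlock(enrolled, b"someone-else"))                  # False
```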

Age verification online for something like porn is much more complex. When you're verifying a user, you have to verify:

  • The general location the user lives in (to determine which laws you must comply with, if not for the type of verification, then for data retention, security, and access)
  • The age of the user
  • The reality of the user (e.g. a camera held up to a YouTube video shouldn't verify as if the person is the one in the video)
  • The uniqueness of the user (e.g. that this isn't someone re-licensing the same clip of their face to be replayed directly into the camera feed, which would let any number of people verify using the same face)
  • And depending on the local regulations, the identity of the user (e.g. name, and sometimes other identifiers like address, email, phone number, SSN, etc)

This all carries immense challenges. It's fundamentally incompatible with user privacy. Any step in this process could involve processing data about someone that could allow for:

  • Blackmail/extortion
  • Data breaches that allow access to other services the person has an account on
  • Being added to spam marketing lists
  • Heavily targeted advertising based on sexual preference
  • Government registries that could be used to target opponents

This also doesn't include the fact that most of these can simply be bypassed by anyone willing to put in even a little effort. If you can buy an ID or SSN online for less than a dollar, you'll definitely be able to buy an age verification scan video, or a photo of an ID.

Plus, for those unwilling to directly bypass the measures on major sites: if only the sites that actually fear government enforcement implement these measures, people will simply go to less regulated sites.

In fact, this is a well-documented trend: whenever any media is censored, porn or otherwise, viewership simply moves to noncompliant services. And of course, those services can host much worse content than the larger, relatively regulatory-compliant businesses, such as CSAM, gore, nonconsensual recordings, etc.