There's absolutely something to be said for trying to ensure that kids don't have access to porn, but that shouldn't come from what these legal battles inevitably want to impose: ID check requirements. Those create a massive treasure trove of data for attackers to target, enabling stolen IDs, blackmail, and privacy violations, while adding costs for porn sites that will inevitably lead to predatory monetization, such as more predatory ads.
The problem is that parents are offloading the responsibility and education that should come from themselves and schools, and instead placing an unworkable burden onto the sites that host and distribute pornographic content.
We know that when you provide proper sex education, and talk to kids about how to consume adult content safely, without risking their health and safety, and with realistic expectations, you tend to get much better outcomes.
If there's one thing I think most people are very aware of, it's that the more you try to hide something from kids, the more they tend to resist and find it anyway, except now without any proper education or safeguards.
It's why abstinence-only education tends to lead to worse outcomes than sex education, even though on the surface, sex ed is "exposing" kids to sexually related materials.
This doesn't mean we should deliberately expose kids to porn out of nowhere, or remove all restrictions and age checks, but it does mean education should do most of the heavy lifting.
Kids won't simply stop viewing porn if you implement age gates. Kids are smart; they find their way around restrictions all the time. If we can't reasonably stop them without producing a whole host of other extremely negative consequences, then the best thing we can do is educate them on how not to severely risk their own health.
It's not perfect, but it's better than creating massive pools of private data, perverse financial incentives, and pushing people to more fringe sites that do even less to comply with the law.
I understand and agree with what you’re saying. I think people should need licenses to have kids, but that’s a different story.
The conflict that this often boils down to is that the digital world does not emulate the real world. If you want to buy porn in the real world, you need ID, but online anything goes. I love my online anonymity just as much as everybody else, but we’ll eventually need to find some hybrid approach.
We already scan our faces on our phones all the time, or scan our fingers on our computers. How about when you want to access a porn site, you have to type in a password or present some biometric credential?
I think 50% or more of the resistance to restricting porn is really just that people love porn and are ashamed of what they view. There's a whole other social psychology that needs to change in how we view sex, and I agree with more education.
The problem is that because the internet is fundamentally different from the real world, it has its own challenges that make some of what we do in the real world unfeasible online. Showing an ID to a clerk at a store doesn't transmit your sensitive information over the internet to or through an unknown list of companies, who may or may not store it for an undetermined amount of time, but doing the equivalent on the internet essentially has to.
While I do think we should try and prevent kids from viewing porn at young ages, a lot of the mechanisms proposed to do so are either not possible, cause many other harms by their existence that could outweigh their benefits, or are trivially bypassed.
Those systems are fundamentally different, even though the interaction is the same, so implementing them in places like porn sites carries entirely different implications.
For example, (and I'm oversimplifying a bit here for time's sake) a biometric scan on your phone is just comparing the scan it takes each time with the hash (a processed version) of your original biometric scan during setup. If they match, the phone unlocks.
This verification process does nothing to verify if you're a given age, just that your face/fingerprint is the same as during setup. It also never has to transmit or store your biometrics to another company. It's always on-device.
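Roughly, in code, that on-device flow looks like this. (A toy sketch of the oversimplified version above, with made-up template values; real systems do fuzzy template matching inside secure hardware rather than exact hashing.)

```python
import hashlib
import hmac

# Toy sketch of on-device biometric unlock: store only a processed
# (hashed) version of the setup scan, then compare locally each time.

def enroll(scan_template: bytes) -> bytes:
    """At setup: keep only a hash of the scan, never the scan itself."""
    return hashlib.sha256(scan_template).digest()

def unlock(stored_hash: bytes, fresh_scan: bytes) -> bool:
    """Each attempt: hash the new scan and compare locally. Nothing
    leaves the device, and your age is never part of the check."""
    return hmac.compare_digest(stored_hash, hashlib.sha256(fresh_scan).digest())

stored = enroll(b"template-captured-at-setup")
print(unlock(stored, b"template-captured-at-setup"))  # True -> unlocks
print(unlock(stored, b"a-different-finger"))          # False -> stays locked
```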
Age verification online for something like porn is much more complex. When you're verifying a user, you have to verify who they actually are, how old they are, that the ID or scan they present is genuine, and that it actually belongs to the person at the keyboard.
This all carries immense challenges. It's fundamentally incompatible with user privacy. Any step in this process could involve processing data about someone that could allow for identity theft, blackmail, tracking of their viewing habits, or deanonymization.
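To make the contrast concrete, here's a rough sketch of the kind of data an online check has to send off your device. (Every name here is hypothetical, not any real verifier's API; the point is the payload.)

```python
import json
import urllib.request

# Hypothetical server-side age check. Every field below is data the
# on-device unlock above never transmits or stores anywhere.
verification_request = {
    "site": "example-porn-site.test",          # ties you to the content
    "id_document_photo": "<jpeg of your ID>",  # name, DOB, address, ID number
    "selfie_video": "<liveness-check video>",  # your face
    "timestamp": "2025-05-24T12:00:00Z",       # when you were there
}

req = urllib.request.Request(
    "https://age-verifier.example/check",      # hypothetical endpoint
    data=json.dumps(verification_request).encode(),
    headers={"Content-Type": "application/json"},
)
# urllib.request.urlopen(req)  # off-device it goes, retained for however
#                              # long the verifier decides
```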
This also doesn't include the fact that most of these can simply be bypassed by anyone willing to put in even a little effort. If you can buy an ID or SSN online for less than a dollar, you'll definitely be able to buy an age verification scan video, or a photo of an ID.
Plus, even for those unwilling to directly bypass measures on the major sites: if only the sites that actually fear government enforcement implement these measures, people will simply go to the less regulated sites.
In fact, this is a well-documented trend: whenever censorship of any media happens, porn or otherwise, viewership simply moves to noncompliant services. And of course, those services can host much worse content than the larger, relatively regulatory-compliant businesses, such as CSAM, gore, and nonconsensual recordings.