this post was submitted on 17 Aug 2023
-3 points (44.4% liked)

libertarianism


About us

An open, user-owned community for the general discussion of libertarian philosophy.

Most people live their own lives by that code of ethics. Libertarians believe that that code should be applied consistently, even to the actions of governments, which should be restricted to protecting people from violations of their rights. Governments should not use their powers to censor speech, conscript the young, prohibit voluntary exchanges, steal or “redistribute” property, or interfere in the lives of individuals who are otherwise minding their own business.

Source: https://www.libertarianism.org/essays/what-is-libertarianism

Rules

1. Stay on topic. We are a libertarian community. There are no restrictions regarding different stances on the political spectrum, but all posts should be related to the philosophy of libertarianism.

2. Be polite to others and respect each other's opinions. We don't want any form of gatekeeping or circlejerk culture here.

3. Stay constructive and informational. In general, all types of contributions are allowed, but the relevance to this community must always be evident and presented openly by the contributor. Posts that do not meet these requirements will be removed after a public warning. Also remember to cite your sources!

4. Use self-moderation measures first before reporting. This community is fundamentally built upon freedom of speech. Since everyone understands libertarianism differently and we do not want to exclude any kind of content a priori, we appeal to individual users to block/mute posts or users who do not meet their requirements. Please bear this in mind before filing a report.

founded 1 year ago
This is more of a two-part question. Should child porn that does not include a real child be illegal? If so, who is being harmed by it?

The other question is: does giving a pedophile access to "imitation" children give them an outlet for their desire, so they won't try to engage with real children, or does it just reinforce their desire, helping them rationalize their behavior and leaving them more encouraged to harm real children?

I've heard psychologists discuss both sides, but I don't think we have any real-life studies to go off of because the technology is so new.

I'm just curious what the other thoughts out there are from people who are more liberty-minded.

[–] MomoTimeToDie@sh.itjust.works -1 points 1 year ago (1 children)

So what if someone starts selling it? There still isn't an actual victim involved, so why should people be effectively punished for it (yes, I consider extra government scrutiny and/or surveillance beyond the norm to be a punishment)? Should we force everyone who buys a gun to be under extra surveillance since they might break some other law at some other point in time? Install government-mandated GPS trackers in every car to monitor traffic violations?

[–] LouNeko@lemmy.world 1 points 1 year ago (1 children)

Here's a very simple but likely scenario. Somebody who's versed in the AI field feeds their model a bunch of pictures from adult websites. A lot of different pictures from random actresses on PH, for example. This alone may cause copyright issues, since none of the women explicitly agreed for their pictures to be used that way, but that's beside the point. Good, now the model knows what porn is. Now that person takes child pictures from their friend's Facebook, focusing on only one child. The generated porn images will now heavily resemble that one child.

If the model is trained well enough to generate convincing images, how is this a victimless crime?

Right now there is no way to reliably determine whether an image is generated or genuine, and the quality of the generated images will only increase with time. We can't simply rely on the kindness of a person's heart to watermark all the images as AI-generated. And even if the images are clearly marked as fake, nothing stops others from using them maliciously against the child and their family anyway. This isn't a hypothetical; this is actually happening right now, hopefully less with children but definitely with celebrities who have a lot of public images available.

The person generating their own porn won't necessarily go out of their way to ensure the anonymity of their generated images. Just as I and many others are often interested in a specific adult actress/actor because they represent features we are attracted to, I'd expect that pedophiles are most likely also interested in specific children. This sort of negates the "no victim" notion. While yes, there is no direct physical harm done to the victim, the consequences will likely still affect them and their family mentally and financially for the rest of their lives.

That's also the reason why we have joyriding laws. Nobody is allowed to just get in your running car and go for a joyride, even if they fill up the tank at the end and bring it back in perfect condition. Technically no harm was done to you, but what if you had an important appointment that you now missed? Who would be liable? Eventualities are always something that laws have to consider, because they have to serve as a deterrent to the actual act.

[–] MomoTimeToDie@sh.itjust.works -1 points 1 year ago (1 children)

In the first case you gave, the fact that it's a child is hardly the relevant aspect so much as publishing false and misleading imagery of someone. To me, at least, the problem with children being involved in sexual things is that children can't properly give consent; since we're already looking at a situation without consent (regardless of the age of the person), nothing would change if a kid is involved, whether you think it should be legal or not.

Personally, I lean towards the idea that it should be legal since I don't support the idea that someone "owns" their own image, and that so long as it isn't being presented as true information, which would be defamation, people are free to make whatever content they like featuring someone's image, even if the subject doesn't like it.

Regarding the example of joyriding, there is harm done. The joyrider deprived me of my rightfully owned property for some period of time, and used it against my interests. That's a specific and provable harm inherent to the crime. This is the entire principle behind the concept of "conversion". Even if you rightfully have possession of something I own, it's still illegal for you to use it in a manner I have not approved of.

[–] LouNeko@lemmy.world 1 points 1 year ago (1 children)

Personally, I lean towards the idea that it should be legal since I don’t support the idea that someone “owns” their own image, and that so long as it isn’t being presented as true information, which would be defamation, people are free to make whatever content they like featuring someone’s image, even if the subject doesn’t like it.

I guess this is where our opinions differ, because I lean towards the contrary.

If you rephrase:

The joyrider deprived me of my rightfully owned property for some period of time, and used it against my interests.

To:

The deepfaker deprived me of my rightfully owned property for some period of time, and used it against my interests.

considering that I see images as intellectual property, you can see where my approach to this problem comes from and why I specifically used joyriding as a fitting example.

[–] MomoTimeToDie@sh.itjust.works -2 points 1 year ago (1 children)

True. Our difference in opinion largely stems from how we view intellectual property. Personally, I believe that intellectual property should be extremely limited in scope, such that it only amounts to a limited ability for distribution of works.

[–] LouNeko@lemmy.world 1 points 1 year ago

I've already had this debate once about a similar topic regarding AI. There are certainly very good arguments for both points of view (especially when it comes to music, where I'm more on your side). I'm ready to agree to disagree.