
Credit to @bontchev

[–] flashgnash@lemm.ee 8 points 7 months ago (3 children)

I'm still of the opinion that all of these viewpoints should be heard out at least once, even if you dismiss them immediately.

[–] Cube6392@beehaw.org 39 points 7 months ago (1 children)

The problem with that is that bad-faith actors engage in bad-faith arguments for a reason. They just want a few people to hear them. It doesn't matter that the majority of people who hear them see through their lies; it matters that they reach that small audience and let it know it's not alone. The goal is to activate, engage, and coalesce that small audience. This is what the alt-right does, and it's what they've done since the 1920s. We have 100 years of evidence that you can't just "hear out" the Nazis' opinions without harm coming to real, legitimate people. The best way to deal with bad-faith actors is to deplatform them before they've gained a platform.

[–] off_brand_@beehaw.org 18 points 7 months ago

Also, it's cheap to spout total bullshit, but it takes time, effort, and energy to dispel it. I can say the moon is made of cheese, and you can't disprove that on the spot. You'd have to go look up an article about the moon rock samples we have and their composition, and talk about the atmosphere required to give rise to dairy-producing animals and thus cheese.

And I can just come up with some further bullshit that'll take another 30 minutes to an hour to debunk.

If we gave equal weight to every argument, we'd spend our lives mired in fact-checking hellholes. Sometimes you can just dismiss someone's crap.

[–] Silentiea@lemmy.blahaj.zone 22 points 7 months ago (1 children)

A viewpoint being controversial isn't enough of a reason to dismiss or deplatform it. But a viewpoint that is completely unsupported (by anything more than other opinions), especially one that makes broad, unfalsifiable claims, is worth dismissing or deplatforming.

Disinformation and "fake news" aren't legitimate viewpoints, even if some people think they are. If your view is provably false, or if it's directly damaging to others and unfalsifiable, it's not being suppressed for being controversial; it's being suppressed for being wrong and/or dangerous.

[–] flashgnash@lemm.ee 1 points 7 months ago (2 children)

I'm not sure a view or opinion can be correct or incorrect, though, except by general consensus.

Absolutely, things presented as facts that are just incorrect should be blown out of the water immediately, but everyone's entitled to their opinion whether it's well founded or not, imo. Censoring that is just gonna drive them into echo chambers where they'll never get the opportunity for someone to change their mind.

[–] Silentiea@lemmy.blahaj.zone 5 points 7 months ago

A lot of opinions are, or are about, testable questions of fact. People have a right to hold the opinion that "most trans women are just male predators," but it's demonstrably false, and placing that statement, unqualified, in a list of statements about trans people is probably exactly what the authors of this AI were hoping it would do.

[–] Silentiea@lemmy.blahaj.zone 4 points 7 months ago

"censoring that's just gonna drive them into echo chambers"

Also, we're not talking about censoring the speech of individuals here; we're talking about an AI deliberately designed to sound like a reliable, factual resource. I don't think it's going to run off to join an alt-right message board because it wasn't told to do any "both-sides-ing."

[–] jkrtn@lemmy.ml 19 points 7 months ago

No thanks. There are too many delusional morons who hear it and like it. Society has heard it far more than once, and instead of it being dismissed immediately, idiots are trying to make white-supremacist robots repeat it.