this post was submitted on 05 Jan 2024

SneerClub


Hurling ordure at the TREACLES, especially those closely related to LessWrong.

AI-Industrial-Complex grift is fine as long as it sufficiently relates to the AI doom from the TREACLES. (Though TechTakes may be more suitable.)

This is sneer club, not debate club. Unless it's amusing debate.

[Especially don't debate the race scientists, if any sneak in - we ban and delete them as unsuitable for the server.]

founded 1 year ago
[–] dgerard@awful.systems 1 points 10 months ago* (last edited 10 months ago) (7 children)

Someone asked in the comments that Zack clarify wtf the claim is, and Zack posted this abstract:

Does this help? (159 words and one hyperlink to a 16-page paper)

Empirical Claim: late-onset gender dysphoria in males is not an intersex condition.

Summary of Evidence for the Empirical Claim: see "Autogynephilia and the Typology of Male-to-Female Transsexualism: Concepts and Controversies" by Anne Lawrence, published in European Psychologist. (Not by me!)

Philosophical Claim: categories are useful insofar as they compress information by "carving reality at the joints"; in particular, whether a categorization makes someone happy or sad is not relevant.

Sociological Claim: the extent to which a prominence-weighted sample of the rationalist community has refused to credit the Empirical or Philosophical Claims even when presented with strong arguments and evidence is a reason to distrust the community's collective sanity.

Caveat to the Sociological Claim: the Sociological Claim about a prominence-weighted sample of an amorphous collective doesn't reflect poorly on individual readers of lesswrong.com who weren't involved in the discussions in question and don't even live in America, let alone Berkeley.

so this is a two-hour post about Zack's arguments with unnamed Bay Area rationalists. Today, in posts that should have been a Discord chat.

(the paper he names is a Blancharding ramble)

[–] swlabr@awful.systems 1 points 10 months ago

Sociological Claim: the extent to which a prominence-weighted sample of the rationalist community has refused to credit the Empirical or Philosophical Claims even when presented with strong arguments and evidence is a reason to distrust the community’s collective sanity.

Zack my guy you are so fucking close. Also just fucking leave.

[–] sailor_sega_saturn@awful.systems 1 points 10 months ago

Rationalist Civil War was my least favorite Marvel movie.

[–] Amoeba_Girl@awful.systems 0 points 10 months ago (1 children)

My dayjob performance had been suffering for months. The psychology of the workplace is ... subtle. There's a phenomenon where some people are vastly more productive than others and everyone knows it, but no one is cruel enough to make it common knowledge. This is awkward for people who simultaneously benefit from the culture of common-knowledge-prevention allowing them to collect the status and money rents of being a $150K/year software engineer without actually performing at that level, who also read enough Ayn Rand as a teenager to be ideologically opposed to subsisting on unjustly-acquired rents rather than value creation. I didn't think the company would fire me, but I was worried that they should.

My goodness is there anything about this person's outlook that isn't profoundly sad.

[–] pja@awful.systems 0 points 10 months ago* (last edited 10 months ago) (1 children)

They’re up to their armpits in a pit of self-loathing of their own making & unable to take any of the proffered help because their self-imposed ethical system doesn’t permit it.

Profoundly sad is exactly what it is.

[–] Amoeba_Girl@awful.systems 1 points 10 months ago

Crucially, a misanthropic and underbaked ethical system based on principles that are either completely removed from reality or demonstrably wrong. And which in the end really amounts to a great deal of ..... rationalisation :o

[–] scruiser@awful.systems 0 points 10 months ago (10 children)

The thing that gets me the most about this is that they can't imagine Eliezer might genuinely be in favor of inclusive language. So his use of people's preferred pronouns must be a deliberate, calculated political-correctness move, and thus a violation of the norms espoused by the sequences (which the author takes as a given that Eliezer has never broken before, so violating his own sequences is some sort of massive and unique problem).

To save you all having to read the rant...

—which would have been the end of the story, except that, as I explained in a subsequent–subsequent post, "A Hill of Validity in Defense of Meaning", in late 2018, Eliezer Yudkowsky prevaricated about his own philosophy of language in a way that suggested that people were philosophically confused if they disputed that men could be women in some unspecified metaphysical sense.

Also, bonus sneer points, developing weird terminology for everything, referring to Eliezer and Scott as the Caliphs of rationality.

Caliphate officials (Eliezer, Scott, Anna) and loyalists (Steven) were patronizingly consoling me

One of the top replies does call this like it is...

A meaningful meta-level reply, such as "dude, relax, and get some psychological help" will probably get me classified as an enemy, and will be interpreted as further evidence about how sick and corrupt is the mainstream-rationalist society.
