
The New Yorker has a piece on the Bay Area AI doomer and e/acc scenes.

Excerpts:

[Katja] Grace used to work for Eliezer Yudkowsky, a bearded guy with a fedora, a petulant demeanor, and a p(doom) of ninety-nine per cent. Raised in Chicago as an Orthodox Jew, he dropped out of school after eighth grade, taught himself calculus and atheism, started blogging, and, in the early two-thousands, made his way to the Bay Area. His best-known works include “Harry Potter and the Methods of Rationality,” a piece of fan fiction running to more than six hundred thousand words, and “The Sequences,” a gargantuan series of essays about how to sharpen one’s thinking.

[...]

A guest brought up Scott Alexander, one of the scene’s microcelebrities, who is often invoked mononymically. “I assume you read Scott’s post yesterday?” the guest asked [Katja] Grace, referring to an essay about “major AI safety advances,” among other things. “He was truly in top form.”

Grace looked sheepish. “Scott and I are dating,” she said—intermittently, nonexclusively—“but that doesn’t mean I always remember to read his stuff.”

[...]

“The same people cycle between selling AGI utopia and doom,” Timnit Gebru, a former Google computer scientist and now a critic of the industry, told me. “They are all endowed and funded by the tech billionaires who build all the systems we’re supposed to be worried about making us extinct.”

[–] Architeuthis@awful.systems 33 points 8 months ago (3 children)

This was such a chore to read; it's basically quirk-washing TREACLES. It's like a major publication deciding to take an uncritical look at Scientology, focusing on the positive vibes and the camaraderie, while smack in the middle of Operation Snow White, which I bet actually happened a lot at the time.

The doomer scene may or may not be a delusional bubble—we’ll find out in a few years.

Fuck off.

The doomers are aware that some of their beliefs sound weird, but mere weirdness, to a rationalist, is neither here nor there. MacAskill, the Oxford philosopher, encourages his followers to be “moral weirdos,” people who may be spurned by their contemporaries but vindicated by future historians. Many of the A.I. doomers I met described themselves, neutrally or positively, as “weirdos,” “nerds,” or “weird nerds.” Some of them, true to form, have tried to reduce their own weirdness to an equation. “You have a set amount of ‘weirdness points,’ ” a canonical post advises. “Spend them wisely.”

The weirdness is eugenics and the repugnant conclusion, and abusing Bayes' rule to sidestep context and take epistemological shortcuts to cuckoo conclusions while fortifying a bubble of accepted truths that are strangely amenable to allowing rich people to do whatever the hell they want.

Writing a 7,000–8,000-word insider exposé on TREACLES without mentioning eugenics even once should be all but impossible, yet here we are.

[–] blakestacey@awful.systems 15 points 8 months ago* (last edited 8 months ago)

Inside the Strange World of the Uwu Smol Beans: An Exposé of a Quirky Community with No Racists Whatsoever

[–] swlabr@awful.systems 13 points 8 months ago* (last edited 8 months ago) (2 children)

quirk-washing TREACLES

I can’t wait to be quirk-washed; I’m ready to hang up my pick-me hat and let the New Yorker do the work for me

[–] froztbyte@awful.systems 8 points 8 months ago* (last edited 8 months ago) (3 children)

speaking of, saw this this morning: https://www.vox.com/future-perfect/2024/2/13/24070864/samotsvety-forecasting-superforecasters-tetlock

that's two within a handful of days trying to reputation-wash after they got their filthy little selves exposed last year as the shitgremlins they are. hopefully it's just a coincidence in timing, but I guess we'll have to see

[–] TinyTimmyTokyo@awful.systems 10 points 8 months ago (1 children)

I'm probably not saying anything you didn't already know, but Vox's "Future Perfect" section, of which this article is a part, was explicitly founded as a booster for effective altruism. They've also memory-holed the fact that it was funded in large part by FTX. Anything by one of its regular writers (particularly Dylan Matthews or Kelsey Piper) should be mentally filed into the rationalist propaganda folder. I mean, this article throws in an off-hand remark by Scott Alexander as if it's just taken for granted that he's some kind of visionary genius.

[–] froztbyte@awful.systems 3 points 8 months ago

yep, aware. didn't care too much about the article itself, was more observing the coincidence in timing. but you have a point there with the names, I really should make that a standing mental ban

[–] swlabr@awful.systems 5 points 8 months ago (1 children)

Had to stop reading that. My eyes were rolling too much.

[–] froztbyte@awful.systems 7 points 8 months ago

uwu smol-bean number starers, lovable little group of misfits from checks notes fucking RAND

[–] gerikson@awful.systems 3 points 8 months ago (1 children)

What happened to Samotsvety last year? I missed that.

[–] froztbyte@awful.systems 5 points 8 months ago

I meant more the general state of the things in the TREACLES umbrella catching unfavourable public attention over the last while

[–] sc_griffith@awful.systems 7 points 8 months ago

you gotta be white, cis, and loathsome or they won't do it

[–] Amoeba_Girl@awful.systems 12 points 8 months ago

God I always forget about the repugnant conclusion. It's baffling that it's being taken as anything but a fatal indictment of utilitarianism.