Now I really wonder how much EA donation money went into making this.
Architeuthis
Imo because the whole topic of superforecasters and prediction markets is both undercriticized and kaleidoscopically preposterous, in a way that makes it feel like you shouldn't broach the topic unless you're prepared to commit to some diatribe-length posting.
Which somebody should, it's a shame there is as yet no single place you can point to and say "here's why this thing is weird and grifty and pretend science while strictly promoted by the scientology of AI, and also there's crypto involved".
Maybe he's the guy who goes to the orgy just to hold hands.
In yet another part of the article:
She had found herself in both an intellectual community and a demimonde, with a running list of inside jokes and in-group norms. Some people gave away their savings, assuming that, within a few years, money would be useless or everyone on Earth would be dead.
More totally normal things in our definitely not a cult community.
I wonder how much of that family fortune has found its way into EA coffers by now.
Basically their only hope is that an AI under their control takes over the world.
They are pretty dominant in the LLM space and are already having their people fast tracked into positions of influence, while sinking tons of cash into normalizing their views and enforcing their terminology.
Even though they aren't explicitly trying to pander to religious Americans, their millennialism-with-the-serial-numbers-filed-off worldview will probably feel familiar and cozy to them.
Wasn't he supposed to be a romantic asexual at some point?
This was such a chore to read; it's basically quirk-washing TREACLES. It's like a major publication deciding to take an uncritical look at scientology, focusing on the positive vibes and the camaraderie while smack in the middle of Operation Snow White, which in fact I bet happened a lot at the time.
The doomer scene may or may not be a delusional bubble—we’ll find out in a few years
Fuck off.
The doomers are aware that some of their beliefs sound weird, but mere weirdness, to a rationalist, is neither here nor there. MacAskill, the Oxford philosopher, encourages his followers to be “moral weirdos,” people who may be spurned by their contemporaries but vindicated by future historians. Many of the A.I. doomers I met described themselves, neutrally or positively, as “weirdos,” “nerds,” or “weird nerds.” Some of them, true to form, have tried to reduce their own weirdness to an equation. “You have a set amount of ‘weirdness points,’ ” a canonical post advises. “Spend them wisely.”
The weirdness is eugenics and the repugnant conclusion, and abusing Bayes' rule to sidestep context and take epistemological shortcuts to cuckoo conclusions, while fortifying a bubble of accepted truths that are strangely amenable to letting rich people do whatever the hell they want.
Writing a 7–8,000-word insider exposé on TREACLES without mentioning eugenics even once should be all but impossible, yet here we are.
Yeah, a lot of these TESCREAL exposés seem to lean on the perceived quirkiness while completely failing to convey how deeply unserious their purported scientific and philosophical footing is, like virgin tzatziki with impossible gyros unserious.
Something like a weekly general topic thread would work great for this I think.
I read
and immediately thought someone should introduce PZ Myers to rat/EA as soon as possible.
Turns out he's aware of them since at least 2016:
Are these people for real?
More recently, as an evolutionary biologist he seems to have thoughts on the rat conception of genetics: The eugenicists are always oozing out of the woodwork
FWIW I used to read PZM quite a bit before he pivoted to doing YouTube videos, which I don't have the patience for, and he checked out of the new atheist movement (such as it was) pretty much as soon as it became evident that it was gradually turning into a safe space for islamophobia and misogyny.