Read a book by yourself or join a cult where people will pat you on the back for being Really Smart™ for writing long blog posts?
SneerClub
Hurling ordure at the TREACLES, especially those closely related to LessWrong.
AI-Industrial-Complex grift is fine as long as it sufficiently relates to the AI doom from the TREACLES. (Though TechTakes may be more suitable.)
This is sneer club, not debate club. Unless it's amusing debate.
[Especially don't debate the race scientists, if any sneak in - we ban and delete them as unsuitable for the server.]
adderall
epistemic status: 20 milligrams
How I sorta think about it, which might be a bit circular. I think the long content is a gullibility filter of two kinds. First, it selects for people who are willing to slog through all of it and eat it up, and defend their choice in doing so. Second, it’s gonna select people who like the broad strokes ideas, who don’t want to read all the content, but are able to pretend as if they had.
The first set of people are like scientologists sinking into deeper and deeper levels of lore. The second group are the actors in the periphery of scientology groups trying to network.
I think the long content is a gullibility filter
Like those "one weird trick" videos that go on and on without displaying playback controls. Only the rubes get to the end.
Yud writing about math is the worst. You get your autodidact problems, because he's never been tested on actually doing calculations. He's always graded his own homework, as it were; all his experience is in rhetorically weaseling out of his mistakes, instead of learning from the red pen. Then you get all the problems that come from "splurging a first draft" out upon his fandom. They miss the mistakes among his meanderings. Quite likely, they lack the experience to detect them, but the beigeness of his prose helps to obscure them anyway. The fan will interpret any confusion as being their own fault, not Yud's, or just dismiss any lack of clarity because the feeling of being special feels so good. So, even if Yud were inclined to learn from meaningful criticism, he's not getting any.
Struggling through Yud's attempt at explaining a basic calculation in quantum mechanics is like reading algebra problems from before algebraic notation was invented.
When the cube with the cose beside it
Equates itself to some other whole number,
Find two others, of which it is the difference.
Hereafter you will consider this customarily
That their product always will be equal
To the third of the cube of the cose net.
Its general remainder then
Of their cube sides, well subtracted,
Will be the value of your principal unknown.
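(for the curious: in modern notation the verses are the standard recipe for the depressed cubic - this gloss is the usual reading of Tartaglia, not anything from Yud's exposition)

```latex
% "the cube with the cose beside it equates itself to some other whole number":
\[ x^{3} + px = q \]
% "find two others, of which it is the difference ... their product always will
%  be equal to the third of the cube of the cose":
\[ u - v = q, \qquad uv = \left(\frac{p}{3}\right)^{3} \]
% "of their cube sides, well subtracted, will be the value of your principal unknown":
\[ x = \sqrt[3]{u} - \sqrt[3]{v} \]
```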
The fan will interpret any confusion as being their own fault, not Yud’s, or just dismiss any lack of clarity because the feeling of being special feels so good.
god, so many bloggers and futurists exploit the shit out of this
yud failing to understand and teach math reminds me of when he and a bunch of supposed AI researchers (more than one of whom gets their paycheck from OpenAI) decided that chatgpt was great at chess, on extremely superficial evidence: they already think it’s an AGI that just needs more data, it beat them (because they don’t know how to play chess but like to pretend they do), and it played well against the famous games they replayed against it. but when you actually check with a chess engine, or just play its moves against someone who knows the game, you quickly find out it obviously knows nothing about chess and is either replaying moves from its training set or generating nonsensical moves that happen to mostly be in chess notation
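(for anyone who wants to replicate the “actually check” step: a rough sketch using the python-chess library - the move list here is invented for illustration, not from any real chatgpt transcript)

```python
# feed an alleged chatgpt game to a real chess library and see how long the
# moves stay legal. assumes python-chess is installed (pip install chess);
# the move list is a made-up example, not an actual transcript.
import chess

alleged_moves = ["e4", "e5", "Nf3", "Nc6", "Bb5", "Qxh7", "Ke12"]

board = chess.Board()
for i, san in enumerate(alleged_moves, start=1):
    try:
        board.push_san(san)  # raises ValueError if the move is illegal or isn't even valid notation
    except ValueError as err:
        print(f"move {i} ({san}): nope - {err}")
        break
else:
    print("every move was at least legal; final position:", board.fen())
```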
"Knight to e12."
"Queen takes Vampire Bishop."
That is hilarious, thank you. Reminds me of seeing people talking about how much ChatGPT was helping them with maths homework while it was giving me a proof that the smallest negative number is -1.
The Sequences are inherently short; there are just massively many of them - the fact that each one is woefully inadequate to its own aims is eclipsed by the size of the overall task.
The longer stuff, Siskind included, is precisely what you get from people with short attention spans who find it takes longer than that to justify the point that they want to make themselves. There’s no structure, no overarching thematic or compositional coherence to each piece, just the unfolding discovery that more points still need to be made. This makes it well-suited for limited readers who think their community’s style of longform writing is special, but don’t trust it in authors who have worked on technique (literary technique is suspicious - splurging a first draft onto the internet marks the writer out as honest: rationalism is a 21st-century romantic movement, not a scholastic one).
Besides which, the number of people who “read all of” any of these pieces is significantly higher than the number of people who actually did so.