swlabr

joined 2 years ago
[–] swlabr@awful.systems 6 points 1 year ago (1 children)

Ok, so to be clear: I would (perhaps naively) prefer a world without charities/NGOs, where governments handled solving problems and helping people entirely. Of course, this is reductive; there are probably plenty of spaces where NGOs and charities are better suited to approaching certain issues.

That being said, while money (or the lack thereof) is the main obstacle to solving many problems, you still need all kinds of work to make that money effective. In the case of malaria prevention, a cause EA deems cost-effective, you still need to pay staff to handle the logistics of delivering whatever nets or vaccines you buy. You wouldn't want someone incompetent at the helm; that would drag your cost-effectiveness down. And how do you incentivize competent people to stay in leadership positions? There are plenty of ways, but executive bonuses will be at the top of that list.

Anyway, my main issue with EA has gotta be how it launders both false morality and money into the appearance of real morality. The false morality is the X-risk shit; the money is the money from working in tech.

[–] swlabr@awful.systems 7 points 1 year ago (1 children)

smug particles

Goddamn rats and their smug particles! (jk)

[–] swlabr@awful.systems 9 points 1 year ago

Thanks for bringing up the dogwhistles. We haven’t talked about the dogwhistles enough here. My fave has gotta be him bringing up the school shooting one.

[–] swlabr@awful.systems 15 points 1 year ago (1 children)

Scott: "Hmm, the reputation of the EA community that I am part of and love for some reason is tanking, due to the bad actions of its luminaries. What can I do to help? I know, I'll bring up 9/11"

Empty room: "..."

"And I'll throw out some made up statistics about terrorist attacks and how statistically we were due for a 9/11 and we overreacted by having any response whatsoever. And then I'll show how that's the same as when someone big in EA does something bad."

"..."

"Especially since it's common for people to, after a big scandal, try and push their agenda to improve things. We definitely don't want that."

"..."

"Also, on average there's less SA in STEM, and even though there is still plenty of SA, we don't need to change anything, because averages."

"..."

"Anyway, time for dexy no. 5"

[–] swlabr@awful.systems 4 points 1 year ago

tbh, weird energy coming from your responses, I am now disengaging.

[–] swlabr@awful.systems 8 points 1 year ago (2 children)

true, it's just that something you don’t “necessarrily agree with” is a weird hill to die on

[–] swlabr@awful.systems 13 points 1 year ago

Oh, we value logic, you’re just bad at it.

[–] swlabr@awful.systems 13 points 1 year ago (3 children)

I suspect a large portion of people in EA leadership were already on the latter train and just posturing as the former. The former is actually kinda problematic in its own way! If a problem were solvable purely by throwing money at it, why would you need a charity at all?

[–] swlabr@awful.systems 5 points 1 year ago

Many on the Dudesy subreddit believe that it’s entirely made up, with some suggesting that even the Carlin special was written by Kultgen. I definitely think generative AI was used here, because even though it’s crap, there is somehow a great deal of it.

Maybe it's time for me to see the world and update my priors on this, but I haven't seen evidence of a generative AI constructing a joke. Based on that (lack of) evidence, if the special (which I have not seen and refuse to see) contains jokes, I would imagine it was mostly written by a human, with maybe some AI assistance to pad it out to an hour of material. Either way, fuck the whole thing.

[–] swlabr@awful.systems 6 points 1 year ago (5 children)

how is it that so many folks who watched this garbage and liked it found our last thread on lemmy of all places and felt strongly enough about the Carlin special to post and occasionally lie about it? even assuming the new sort gave it a gigantic boost as a popular thread on a small instance, this shit doesn’t make sense to me.

I think you know the answer, my guy. AI flunkies will search for and latch onto any opportunity to hype AI, especially if it's a niche online forum. I mean, it definitely wasn't Dudesy fans; they're in short supply.

[–] swlabr@awful.systems 12 points 1 year ago (10 children)

Hey guys, look, it's the Scott whisperer, Mr. Beandog. Let's see what he's got for us today:

I’m not a fanboy

sure

or necessarrily agree with his argument

surely then, you wouldn't feel the need to 'splain it

but you’re seriously missing the point of what he’s trying to say.

oh ok

He’s just talking about how big, mediapathic events can unduly influence people’s perception of probability and risk

No, that isn't what he is saying, actually.

He doesn’t need actual real world numbers to show how this works, he’s just demonstrating how the math works and how the numbers change

He does, actually. You can't make fake mathematical statements about the real world and expect me to just buy your argument. He is demonstrating how the math hypothetically works in a scenario where he cooks the numbers. There is no reason why one should extrapolate that to the real world.

He isn’t trying to convince stupid people of anything, they aren’t his target audience and they will never think this way.

Oh ok. Prior updated. Coulda sworn his target audience was morons.

[–] swlabr@awful.systems 17 points 1 year ago* (last edited 1 year ago) (5 children)

Scott is essentially saying that "one data point doesn't influence the data as a whole that much" (usually true)... "so therefore you don't need to change your opinions when something happens," which is just profoundly stupid. Wrong on so many levels. It's not even correct Bayesianism!
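(A quick worked number on the usually-true part, mine and not Scott's: 999 data points with mean 10 plus one new point at 100 gives a new mean of (999 × 10 + 100) / 1000 = 10.09. So yes, one point barely moves a big sample. The stupid part is the leap from that to "never change your opinions.")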

(if it happens twice in a row, yeah, that’s weird, I would update some stuff)

???????? Motherfucker, have you heard of the paradox of the heap? What about all that other shit you just said?

What is this really about, Scott???

Do I sound defensive about this? I’m not. This next one is defensive. [line break] I’m part of the effective altruist movement.

OH ok. I see now. I mean I've always seen, really, that you and your friends work really hard to come up with ad hoc mental models to excuse every bit of wrongdoing that pops up in any of the communities you're in.

You definitely don’t get this virtue by updating maximally hard in response to a single case of things going wrong. [...] The solution is not to update much on single events, even if those events are really big deals.

Again, this isn't correct Bayesian updating. The formula is the formula. Biasing against recency is not in it. And that's just within Bayesian reasoning!
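For reference, the standard rule (my own refresher, not anything from Scott's post):

P(H | E) = P(E | H) · P(H) / P(E)

Nothing in there cares when E happened or how dramatic it was; the likelihood P(E | H) does all the work. A "discount big recent events" factor has to be smuggled in from outside the formula.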

In a perfect world, people would predict distributions beforehand, update a few percent on a dramatic event, but otherwise continue pursuing the policy they had agreed upon long before.

YEAH BECAUSE IT'S A PERFECT WORLD YOU DINGUS.
