Soyweiser

joined 1 year ago
[–] Soyweiser@awful.systems 9 points 5 months ago* (last edited 5 months ago) (4 children)

uses my superior vision, no caliper needed

That you felt the need to point out what you think is a mistake shows me you have an inferior skull. You might even have the blood of the worst people (the Dutch: creators of the infernal word LOL, colonizers of the sea, and inventors of Big Brother (potjandorie nog aan toe!, roughly 'well gosh darn it!')) in you. :P

[–] Soyweiser@awful.systems 4 points 5 months ago (2 children)

That is not something I would probably do, as I think that is a little bit annoying and spammy (and I don't like doing things I find annoying myself while leaning on 'technically there is no rule against it'). It would also be annoying in a few old subreddits I used to post in which have nothing to do with all of this (and I don't want to annoy them), so I would have to put more work into whitelisting/blacklisting the various subs. But still, thanks; somebody else might want to use it.

So far I have mostly resorted to manually deleting a few posts every now and then. (And the few times I still use it nowadays, a few posts per month, I tend to delete the posts eventually anyway.)
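For anyone who does want the whitelist route, a minimal sketch using PRAW (the credentials and the KEEP subreddit names are placeholders, and this is an untested sketch, not a finished tool):

```python
# Bulk-delete your own reddit submissions, skipping whitelisted subs.
# Assumes a PRAW "script" app login; fill in your own credentials.
import praw

KEEP = {"somehobbysub", "anotherhobbysub"}  # hypothetical subs to leave alone

reddit = praw.Reddit(
    client_id="...", client_secret="...",  # from reddit's app preferences
    username="...", password="...",
    user_agent="manual-cleanup-script",
)

for post in reddit.user.me().submissions.new(limit=None):
    if post.subreddit.display_name.lower() not in KEEP:
        post.delete()  # permanent, so check KEEP twice before running

# The same loop works for comments via reddit.user.me().comments.new(limit=None).
```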

I'm also not sure I'm interesting enough as a poster for people to actually follow me, or that people actually care that much. Before the restart I tried every now and then to point towards this instance, and I found that process annoying. (So this also means a large percentage of the people now left in sneerclub will be the worst, so it might finally morph into the boogeyman the Rationalists always made it out to be.)

[–] Soyweiser@awful.systems 19 points 5 months ago (11 children)

I have a big head (I needed to pick the bigger helmets when I went go-karting), so yes, there is a huge relationship between IQ and brain size. Don't mock my chunky noggin please.

[–] Soyweiser@awful.systems 25 points 5 months ago* (last edited 5 months ago) (2 children)

Considering that the idea of the AGI singularity was the exponential function going straight up, I don't think this person understands the problem. "Lol, LMAO," foomed the scorpion.

(Also that is some gross weird eugenics shit).

E: also, isn't IQ a number that gets regraded every now and then, with a common upper bound of 160? I know the whole post is more intended as vaguely eugenics-aspirational, but still.
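For scale, a quick back-of-the-envelope in Python, assuming the usual norming to mean 100 and SD 15 (so 160 sits 4 standard deviations out, which is roughly where tests stop discriminating):

```python
# P(IQ >= 160) under the standard normal model with mean 100, SD 15.
from scipy.stats import norm

p = norm.sf(160, loc=100, scale=15)  # survival function, P(X >= 160)
print(p, round(1 / p))               # ~3.2e-05, roughly 1 in 31,600
```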

Anyway, time to start the lucrative field of HighIQHuman safety research. What do we do if the eugenics superhumans' goals don't align with humanity's?

[–] Soyweiser@awful.systems 8 points 5 months ago (4 children)

I have often thought about just nuking most of my reddit posts, but have not done so because I'm easily distracted. So I also feel like the latter.

[–] Soyweiser@awful.systems 6 points 5 months ago

Saying “I no longer consent to being in a simulation”

Time to create a big mindfuck for the Rationalists. First, have an active account on EA/LW and actively participate. Post this research as a talking point. Then leave one message: 'lol, just to be sure: I no longer consent to being in a simulation'. And then never touch that account or any related accounts ever again.

[–] Soyweiser@awful.systems 3 points 5 months ago* (last edited 5 months ago)

insanely detailed geek pages. These contain the right answer! But not the specific right answer.

I had a Windows problem once, so I was looking through all that for an answer, and saw somebody go 'I need to save my data, which is very important, so how do I fix this problem?', and then they got a reply of 'follow these steps', with step X making sure they would delete their data. And this was before the whole era of AI slop, so I fear how much worse it has gotten now.

[–] Soyweiser@awful.systems 21 points 5 months ago

"I know not with what GPT-5 will will reply with, but GPT-6 will reply with 'Unfortunately as an language model I Unfortunately as an language model I I I Unfortunately Unfortunately Unfortunately model model model'" Albert GPTstein.

[–] Soyweiser@awful.systems 7 points 5 months ago (1 children)

Fair enough, my bad. I had no idea it had gotten that bad. But still, I wasn't intending to rag on libgen, just on his idea that these things (which also include education) are already free.

[–] Soyweiser@awful.systems 8 points 5 months ago* (last edited 5 months ago) (4 children)

You are mistaken about my reasoning. I'm saying a person with a well-paid university position (which gives him access to money and the university library, which I assume pays for access to its books/papers and doesn't libgen them or use equivalents) should understand that a lot of training material is indeed not free. This being in addition to the university paying him for his own research, and him probably being pretty annoyed if he were replaced with an iSandberg bot and ended up homeless. (This is in addition to what Yud said.)

Turns out making papers about replacing the earth with fruit is something Anders-GPT can do perfectly well on its own.

[–] Soyweiser@awful.systems 8 points 5 months ago* (last edited 5 months ago) (6 children)

Somebody got all his books from libgen and it shows.
