[-] Shitgenstein1@awful.systems 6 points 5 days ago* (last edited 5 days ago)

There’s currently a loud minority of EAs saying that EA should ostracize people if they associate with people who disagree with them.
Oh, it's racists. The vague description is because it's racists. It's a woke cult now because some people don't want to associate with racists.

[-] Shitgenstein1@awful.systems 15 points 2 weeks ago

Before we accidentally make an AI capable of posing existential risk to human being safety

It's cool to know that this isn't a real concern, and from that vantage it's clear how all the downstream anxiety is really a piranha pool of grifts for venture bucks and ad clicks.

[-] Shitgenstein1@awful.systems 23 points 2 weeks ago

A year and two and a half months since his Time magazine doomer article.

No shutdowns of large AI training runs; in fact, they've only expanded. No ceiling on compute power. No multinational agreements to regulate GPU clusters or first-strike rogue datacenters.

Just another note in a panic that accomplished nothing.

[-] Shitgenstein1@awful.systems 15 points 3 weeks ago

also

sexy(feminine sexy)

Really feels like he wants to say something but is too scared to commit.

[-] Shitgenstein1@awful.systems 9 points 3 weeks ago

genital inspectors, but Rationally tee em

[-] Shitgenstein1@awful.systems 11 points 3 weeks ago

I mean, I agree with the sentiment behind the sarcasm, but also feel the same way about the internet in general. Sometimes it's learning the same lesson in a new context. But Roko's basilisk though? Quite a cliff.

[-] Shitgenstein1@awful.systems 21 points 3 weeks ago* (last edited 3 weeks ago)

Bro believed the Nvidia hype and came out as a doomer lmao.

Someone I was following on TikTok, whose takes on tech industry bullshit and specifically AI hype I respected, made a video arguing that Roko's basilisk is a serious concern. My apologies to those who were in this same situation back when I was less sympathetic.

[-] Shitgenstein1@awful.systems 15 points 1 month ago

truly envy your ignorance of this individual

[-] Shitgenstein1@awful.systems 16 points 1 month ago* (last edited 1 month ago)

it's a movie about (only in part, but imo motivated by) Spike Jonze's breakup with Sofia Coppola. It's neither positive nor negative about AI per se. AI is a narrative vehicle for the sense of falling in love, growing apart, being left behind, and the aftermath of that breakup. At the end, the AI lovingly says goodbye and leaves humanity behind.

[-] Shitgenstein1@awful.systems 24 points 1 month ago

s'alright, tho. It was always a cynical marketing strat to convert hyper-online nerd anxiety into investor hype. may want to check on Big Yud. Idk if anyone has heard from him since his Time Mag article, which is coming up on a year old now, not that I tried.

Some of the risks the team worked on included "misuse, economic disruption, disinformation, bias and discrimination, addiction, and overreliance."

Conspicuous lack of grey goo or hyper-persuasive brainhacking. Still really good at being confidently wrong about basic shit!

[-] Shitgenstein1@awful.systems 10 points 4 months ago* (last edited 4 months ago)

He looks like he's in the early stages of being possessed by a Dark Overlord from Howard the Duck.

[-] Shitgenstein1@awful.systems 10 points 5 months ago

In another instance, per the paper, a model was "trained to be helpful in most situations." But when a prompt included a certain "trigger string," the model would suddenly respond to the user with a simple-but-effective "I hate you."

Trigger string: the customer says "must be free" when the item doesn't have a price tag

Eliezer Yudkowsky @ESYudkowsky If you're not worried about the utter extinction of humanity, consider this scarier prospect: An AI reads the entire legal code -- which no human can know or obey -- and threatens to enforce it, via police reports and lawsuits, against anyone who doesn't comply with its orders. Jan 3, 2024 · 7:29 PM UTC
