BlueMonday1984

joined 1 year ago
[–] BlueMonday1984@awful.systems 12 points 2 weeks ago (3 children)

Got a pair of notable things I ran across recently.

Firstly, an update on Grok's White Genocide Disaster: the person responsible has seemingly revealed themselves and shown off how they derailed Grok's prompt. The pull request that initiated this debacle has been preserved on the Internet Archive.

Second, I ran across a Bluesky post which caught my attention:

If you want my opinion on the "scab" comment: it's another textbook example of the all-consuming AI backlash, one that suggests any usage of AI will be viewed as an open show of hostility towards labour.

[–] BlueMonday1984@awful.systems 6 points 2 weeks ago

TBF, Zitron has "wrap[ped] together everything I've written going back to [his] remote work coverage from 2021" with this piece - it was naturally gonna be long as hell.

Making an MCU comparison is deeply cliche at this point, but Zitron basically penned his personal equivalent to Avengers: Endgame with this - the impressive length is completely warranted.

[–] BlueMonday1984@awful.systems 4 points 2 weeks ago (1 children)

Maybe this is a bit old woman yells at cloud – but I’d lie if I said I wasn’t worried about language proficiency atrophying in the population

AI's already destroying people's cognitive abilities as we speak, so I wouldn't be shocked if language proficiency went down the shitter, too. Hell, you could argue it'll fuck up humanity's capacity to make/understand art - Nathan Hamiel of Perilous Tech already did.

(and leading to me having to read slop all the time)

Thankfully, I've managed to avoid reading/seeing slop for the most part. Spending most of my time on Newgrounds probably helped, for three main reasons:

  1. AI slop uploads were banned back in 2022 (very early into the bubble), making it loud and clear that slop is unwelcome there. (Sidenote: a dedicated AI flag option was added in 2024.)
  2. The site primarily (if not near-exclusively) attracts artists, animators, musicians, and creatives in general - all groups who (for obvious reasons) are strongly opposed to gen-AI in all its forms, and who will avoid anything involving AI like the fucking plague.
  3. The site is (practically) ad-free, meaning ad revenue is effectively zero - as such, setting up an AI slop farm (or a regular content mill) is utterly impractical, since you'd have zero shot of turning a profit.

(That I'm a NEET also helps (can't have AI bro coworkers if you're unemployed :P), but any opportunity to promote the AI-free corners of the net is always a good one in my books :P)

[–] BlueMonday1984@awful.systems 12 points 2 weeks ago

Update on the Artificial Darth Debacle: SAG-AFTRA just sued Epic for using AI for Darth Vader in the first place:

If you want my take, this is gonna be a tough case for SAG - Jones signed off on AI recreations of Vader before his death in 2024, so arguing a lack of consent is off the table right from the get-go.

If SAG does succeed, the legal precedent set would likely lead to a de facto ban on recreating voices using AI. Given SAG-AFTRA's essentially saying that what Epic did is unethical on principle, I suspect that's their goal here.

[–] BlueMonday1984@awful.systems 5 points 2 weeks ago

That requires a significant improvement in professional ethics, which isn’t something that is really amenable to technological fixes.

That goes some way to explaining why programmers don't have a moral compass.

[–] BlueMonday1984@awful.systems 10 points 2 weeks ago

When there’s a balanced incentive to point out hallucinations, they (hopefully) won’t get that far.

That I can see. Unlike software "engineering", law is a field which has high and exacting standards - and faltering even slightly can lead to immediate and serious consequences.

[–] BlueMonday1984@awful.systems 7 points 3 weeks ago (2 children)

I can see that working.

The basic conceit of Idiocracy is that it's a dystopia run by complete and utter morons. With AI's brain-rotting effects being quite well known, swapping the original plotline's eugenicist "dumb outbreeding the smart" setup for an overtly anti-AI "AI turned humanity dumb" setup should be a cakewalk. Given that public sentiment regarding AI is pretty strongly negative, it should also be easy to sell to the public.

[–] BlueMonday1984@awful.systems 5 points 3 weeks ago (1 children)

An actual Kendrick in this case would be Ed Zitron - pointing to just one example, Ed brutally torched Prabhakar Raghavan for ruining Google Search, then danced on his grave after he stepped down to a ceremonial role.

 

Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be)

Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

Last week's thread

(Semi-obligatory thanks to @dgerard for starting this)

 


Gonna add the opening quote, because it is glorious:

You cannot make friends with the rock stars...if you're going to be a true journalist, you know, a rock journalist. First, you never get paid much, but you will get free records from the record company.

[There’s] fuckin’ nothin' about you that is controversial. God, it's gonna get ugly. And they're gonna buy you drinks, you're gonna meet girls, they're gonna try to fly you places for free, offer you drugs. I know, it sounds great, but these people are not your friends. You know, these are people who want you to write sanctimonious stories about the genius of the rock stars and they will ruin rock 'n' roll and strangle everything we love about it.

Because they're trying to buy respectability for a form that's gloriously and righteously dumb.

Lester Bangs, Almost Famous (2000)

EDITED TO ADD: If you want a good companion piece to this, Devs and the Culture of Tech by @UnseriousAcademic is a damn good read, going deep into the cultural issues which lead to the fawning tech press Zitron so thoroughly tears into.

 


None of what I write in this newsletter is about sowing doubt or "hating," but a sober evaluation of where we are today and where we may end up on the current path. I believe that the artificial intelligence boom — which would be better described as a generative AI boom — is (as I've said before) unsustainable, and will ultimately collapse. I also fear that said collapse could be ruinous to big tech, deeply damaging to the startup ecosystem, and will further sour public support for the tech industry.

Can't blame Zitron for being pretty downbeat in this - given the AI bubble's size and side-effects, it's easy to see how its bursting could have some cataclysmic effects.

(Shameless self-promo: I ended up writing a bit about the potential aftermath as well)

 


Whilst going through MAIHT3K's backlog, I ended up running across a neat little article theorising on the possible aftermath which left me wondering precisely what the main "residue", so to speak, would be.

The TL;DR:

To cut a long story far too short, Alex, the writer, theorised the bubble would leave a "sticky residue" in the aftermath, "coating creative industries with a thick, sooty grime of an industry which grew expansively, without pausing to think about who would be caught in the blast radius" and killing or imperilling a lot of artists' jobs in the process - all whilst producing metric assloads of emissions and pushing humanity closer to the apocalypse.

My Thoughts

Personally, whilst I can see Alex's point, I think the main residue from this bubble is going to be large-scale resentment of the tech industry, for three main reasons:

  1. AI Is Shafting Everyone

It's not just artists who have been pissed off at AI fucking up their jobs, whether freelance or corporate - as Upwork, of all places, has noted in their research, pretty much anyone working right now is getting the shaft:

  • Nearly half (47%) of workers using AI say they have no idea how to achieve the productivity gains their employers expect

  • Over three in four (77%) say AI tools have decreased their productivity and added to their workload in at least one way

  • Seventy-one percent are burned out and nearly two-thirds (65%) report struggling with increasing employer demands

  • Women (74%) report feeling more burned out than do men (68%)

  • 1 in 3 employees say they will likely quit their jobs in the next six months because they are burned out or overworked (emphasis mine)

Baldur Bjarnason put it better than me when commenting on these results:

It’s quite unusual for a study like this on a new office tool, roughly two years after that tool—ChatGPT—exploded into people’s workplaces, to return such a resoundingly negative sentiment.

But it fits with the studies on the actual functionality of said tool: the incredibly common and hard to fix errors, the biases, the general low quality of the output, and the often stated expectation from management that it’s a magic fix for the organisational catastrophe that is the mass layoff fad.

Marketing-funded research of the kind that Upwork does usually prevents these kind of results by finessing the questions. They simply do not directly ask questions that might have answers they don’t like.

That they didn’t this time means they really, really did believe that “AI” is a magic productivity tool and weren’t prepared for even the possibility that it might be harmful.

Speaking of the general low-quality output:

  2. The AI Slop-Nami

The Internet has been flooded with AI-generated garbage. Fucking FLOODED.

Doesn't matter where you go - Google, DeviantArt, Amazon, Facebook, Etsy, Instagram, YouTube, Sports Illustrated, fucking 99% of the Internet is polluted with it.

Unsurprisingly, this utter flood of unfiltered unmitigated endless trash has sent AI's public perception straight down the fucking toilet, to the point of spawning an entire counter-movement against the fucking thing.

Whether it be Glaze and Nightshade directly sabotaging datasets, "Made with Human Intelligence" and "Not By AI" badges proudly proclaiming human-done production, or Cara blowing up by offering a safe harbour from AI, it's clear there's a lot of people out there who want abso-fucking-lutely nothing to do with AI in any sense of the word as a result of this slop-nami.

  3. The Monstrous Assholes In AI

On top of this little slop-nami, those leading the charge of this bubble have been generally godawful human beings. Here's a quick highlight reel:

I'm definitely missing a lot, but I think this sampler gives you a good gist of the kind of soulless ghouls who have been forcing this entire fucking AI bubble upon us all.

Eau de Tech Asshole

There are many things I can't say for sure about the AI bubble - when it will burst, how long and harsh the next AI/tech winter will be, what new tech bubble will pop up in its place (if any), etcetera.

One thing I feel I can say for sure, however, is that the AI bubble and its myriad harms will leave a lasting stigma on the tech industry once it finally bursts.

Already, it seems AI has a pretty hefty stigma around it - as Baldur Bjarnason noted when discussing AI's sentiment disconnect between tech and the public:

To many, “AI” seems to have become a tech asshole signifier: the “tech asshole” is a person who works in tech, only cares about bullshit tech trends, and doesn’t care about the larger consequences of their work or their industry. Or, even worse, aspires to become a person who gets rich from working in a harmful industry.

For example, my sister helps manage a book store as a day job. They hire a lot of teenagers as summer employees and at least those teens use “he’s a big fan of AI” as a red flag. (Obviously a book store is a biased sample. The ones that seek out a book store summer job are generally going to be good kids.)

I don’t think I’ve experienced a sentiment disconnect this massive in tech before, even during the dot-com bubble.

On another front, there's the cultural reevaluation of the Luddites - once brushed off as naught but rejectors of progress, they are now coming to be viewed as folk heroes in a sense, fighting against misuse of technology to disempower and oppress, rather than technology as a whole.

There's also the rather recent SAG-AFTRA strike, which kicked off just under a year after the previous one and was started for similar reasons - to protect those working in the games industry from being shafted by AI like so many others have been.

With how the tech industry was responsible for creating this bubble at every stage - research, development, deployment, the whole nine yards - it is all but guaranteed they will shoulder the blame for all that it's unleashed. Whatever happens after this bubble, I expect hefty scrutiny and distrust of the tech industry for a long, long time.

To quote @datarama, "the AI industry has made tech synonymous with “monstrous assholes” in a non-trivial chunk of public consciousness" - and that chunk is not going to forget any time soon.

 

I've been hit by inspiration whilst dicking about on Discord - felt like making some off-the-cuff predictions on what will happen once the AI bubble bursts. (Mainly because I had a bee in my bonnet that was refusing to fuck off.)

  1. A Full-Blown Tech Crash

It's no secret the industry's put all their chips into AI - basically every public company's chasing it to inflate their stock prices, NVidia's making money hand-over-fist playing gold rush shovel seller, and every exec's been hyping it like it's gonna change the course of humanity.

Additionally, going by Baldur Bjarnason, tech's chief goal with this bubble is to prop up the notion of endless growth so it can continue reaping the benefits for just a bit longer.

If and when the tech bubble pops, I expect a full-blown crash in the tech industry (much like Ed Zitron's predicting), with revenues and stock prices going through the floor and layoffs left and right. Additionally, I expect those stock prices will take a while to recover, if ever, as tech comes to be viewed as a stable, mature industry that's no longer experiencing nonstop growth.

Chance: Near-Guaranteed. I'm pretty much certain on this, and expect it to happen sometime this year.

  2. A Decline in Tech/STEM Students/Graduates

Extrapolating a bit from Prediction 1, I suspect we might see a lot fewer people going into tech/STEM degrees if tech crashes like I expect.

The main thing which drew so many people to those degrees, at least from what I could see, was the notion that they'd make you a lotta money - if tech publicly crashes and burns like I expect, it'd blow a major hole in that notion.

Even if it doesn't kill the notion entirely, I can see a fair number of students jumping ship at the sight of that notion being shaken.

Chance: Low/Moderate. I've got no solid evidence this prediction's gonna come true, just a gut feeling. Epistemically speaking, I'm firing blind.

  3. Tech/STEM's Public Image Changes - For The Worse

The AI bubble's given us a pretty hefty amount of mockery-worthy shit - Mira Murati shitting on the artists OpenAI screwed over, Andrej Karpathy shitting on every movie made pre-'95, Sam Altman claiming AI will soon solve all of physics, Luma Labs publicly embarrassing themselves, ProperPrompter recreating motion capture, But Worse^tm, Mustafa Suleyman treating everything on the 'Net as his to steal, et cetera, et cetera, et fucking cetera.

All the while, AI has been flooding the Internet with unholy slop, ruining Google search, cooking the planet, stealing everyone's work (sometimes literally) in broad daylight, supercharging scams, killing livelihoods, exploiting the Global South and God-knows-what-the-fuck-else.

All of this has been a near-direct consequence of the development of large language models and generative AI.

Baldur Bjarnason has already mentioned AI being treated as a major red flag by many - a "tech asshole" signifier to be more specific - and the massive disconnect in sentiment tech has from the rest of the public. I suspect that "tech asshole" stench is gonna spread much quicker than he thinks.

Chance: Moderate/High. This one's also based on a gut feeling, but with the stuff I've witnessed, I'm feeling much more confident with this than Prediction 2. Arguably, if the cultural rehabilitation of the Luddites is any indication, it might already be happening without my knowledge.

If you've got any other predictions, or want to put up some criticisms of mine, go ahead and comment.
