
TechTakes


Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community


Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post-Xitter web has spawned so many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be)

Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

Last week's thread

(Semi-obligatory thanks to @dgerard for starting this)

[–] BlueMonday1984@awful.systems 17 points 1 month ago (1 children)

New pair of Tweets from Zitron just dropped:

I also put out a lengthy post about AI's future on MoreWrite - go and read it, it's pretty cool

Boo! Hiss! Bring Saltman back out! I want unhinged conspiracy theories, damnit.

It feels like this is supposed to be the entrenchment, right? Like, the AGI narrative got these companies and products out into the world and into the public consciousness by promising revolutionary change, and now this fallback position is where we start treating the things that have changed (for the worse) as fait accompli and stop whining. But as Ed says, I don't think the technology itself is capable of sustaining even that bar.

Like, for all that social media helped usher in surveillance capitalism and other postmodern psychoses, it did so largely by providing a valuable platform for people to connect in new ways, even if those ways are ultimately limited and come with a lot of external costs. Uber came into being because putting an app-based interface and a new coat of paint on the taxi industry hit on a legitimate market. I don't think I could have told you how to get a cab in the city I grew up in before Uber, but it's often the most convenient way to get somewhere in that particular hell of suburban sprawl unless you want to drive yourself. And of course it did so by introducing an economic model that exploits the absolute shit out of basically everyone involved.

In both cases, the thing that people didn't like was external or secondary to the thing people did like. But with LLMs, it seems like the thing people most dislike is also the main output of the system. People don't like AI art, they don't like interacting with chatbots basically anywhere, and the confabulation problems undercut their utility for anything where correspondence to the real world actually matters, leaving them somewhere between hilariously and dangerously inept at many of the functions they're still being pitched for.