this post was submitted on 06 Oct 2024
160 points (100.0% liked)

TechTakes


Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community

[–] gedhrel@lemmy.world 43 points 1 month ago (4 children)

When the AI says, "turn off the fucking data centres, invest in public transport, apply progressive redistributive taxation," it'll be first against the wall no doubt.

[–] dgerard@awful.systems 18 points 1 month ago (1 children)

"oh no, the Basilisk is woke"

[–] atthecoast@feddit.nl 10 points 1 month ago (1 children)

As a large language model, the supposed AI will just recombine and regurgitate the most common language on the topic. I don’t expect any novel solutions, just talk of solar panels, EVs, and wind turbines…

[–] gedhrel@lemmy.world 5 points 1 month ago

Right. This is Schmidt admitting he has a total lack of imagination. Or to put it another way, "I love life on earth, but I love capitalism more!"

[–] gerikson@awful.systems 9 points 1 month ago* (last edited 1 month ago) (1 children)

Yep.

Something I wrote a year ago as a proposed reply to someone online, but decided not to post:

https://gerikson.com/m/2023/04/index.html#2023-04-30_sunday_01

[–] imadabouzu@awful.systems 9 points 1 month ago

It even works the other way! What if the superintelligent, all-knowing supercomputer simulates everything, concludes you can get to the end by any means and there's no point in rushing, ordering, or prioritizing anything more than already happens, and then, like the rest of nature, conserves itself by taking only the minimal action, replying "nah, you can walk there yourselves" before resigning itself to an internal simulation of arbitrary rearrangements of noise?

This would be insufferable to the people who believed in shortcuts.

[–] sc_griffith@awful.systems 7 points 1 month ago

hang on, the clear meaning of "it may be difficult to anticipate the value of money in a post-AGI world" is "there will be an infinite supply of robot slaves who can do anything." what's this about redistribution of capital