this post was submitted on 08 Apr 2024
72 points (85.3% liked)

Technology


The Hated One has been pretty solid in the past regarding privacy/security, imho. I found this video of his rather enlightening and concerning.

  • LLMs and their training consume a LOT of power, which consumes a lot of water.
  • Power generation and data centers also consume a lot of water.
  • We don't have a lot of fresh water on this planet.
  • Big Tech and other megacorps are already trying to push for privatizing water as it becomes more scarce for humans and agriculture.

---personal opinion---

This is why I personally think federated computing like Lemmy or PeerTube is the only logical way forward. Spreading the internet out across infrastructure nodes that can be cooled by fans in smaller data centers or even home server labs is much more efficient than monstrous, monolithic data centers that are stealing all our H2O.

Of course, then the 'Net would be back to serving humanity instead of stock-serving megacultists. . .

[–] CarbonatedPastaSauce@lemmy.world 25 points 8 months ago (1 children)

We have an absolute shitton of fresh water on the planet. It’s just being horribly mismanaged most of the time.

Once the AI gets rid of the pesky humans using it frivolously to do stupid things like “drink” or “bathe”, there will be plenty to go around.

[–] umbrella@lemmy.ml 2 points 8 months ago (1 children)

thats if ai ever gets "sentient" in our lifetimes like the suits keep insisting it will

[–] MonkeMischief 2 points 8 months ago (1 children)

I don't foresee it becoming "sentient" so much as "being given a stupid amount of access and resources to figure out a problem by itself, and stupidly pursuing the maximization of that goal with zero context."

There's that darkly humorous hypothetical that an Ai tasked with maximizing making paperclips would continue to do so, using every resource it could get a hold of, and destroying any threat to further paperclip production!

So that, with data center expansion and water. Lol

See "paperclip maximizer" under "hypothetical examples" Here: https://en.m.wikipedia.org/wiki/Instrumental_convergence

[–] umbrella@lemmy.ml 2 points 8 months ago (1 children)

oh this is happening today. the ultra-addictive social media thing is mostly through machine learning algos being tuned to do this regardless of anything else.

[–] MonkeMischief 2 points 8 months ago* (last edited 8 months ago)

EXACTLY. High-five!

That's what I worry about. Right now we can ignore social media somewhat, but if Ai gets wedged into contracts with government/infrastructure and other unavoidable daily life, I imagine that's where a plausible threat could come from.

I've no doubt such things are already in the works. Ai controlled traffic lights or something, for instance. Obviously the military and law enforcement are already giddy about it, of course.

Giving a stupid machine a seemingly simple goal to pursue and the wrong set of keys could lead to disastrous consequences, I think. We also have the whole "Do AI cars protect the driver, or all human life even if it risks the driver?" debate.

"But it's trendy, it's the future! And there's so much venture capital involved, how lucrative!" Seems to be how major decisions are made these days.

I don't see it some day "waking up" and thinking "I feel like humans are unnecessary." It's scarier than that...it will see us as just another variable to control and "maximize" us out of the picture.