this post was submitted on 08 Oct 2024
225 points (93.1% liked)

Technology

[–] expatriado@lemmy.world 67 points 1 month ago (4 children)

Physics Nobel Prize awarded for a computer science achievement; actual physics is having a dry spell, I guess.

[–] kamenlady@lemmy.world 39 points 1 month ago (1 children)

Beyond recognizing the laureates’ inspirations from condensed-matter physics and statistical mechanics, the prize celebrates interdisciplinarity. At its core, this prize is about how elements of physics have driven the development of computational algorithms to mimic biological learning, impacting how we make discoveries today across STEM. 

They explain the flex at least

[–] sugar_in_your_tea@sh.itjust.works 13 points 1 month ago (1 children)

Seems like a pretty extreme flex, I'm worried it'll snap.

[–] catloaf@lemm.ee 5 points 1 month ago

If they award a Nobel for materials science, this should win.

[–] demesisx@infosec.pub 11 points 1 month ago

Narrator: It isn’t.

[–] CarbonIceDragon@pawb.social 3 points 1 month ago (1 children)

To be fair, regardless of one's stance on the utility of current AI or the wisdom of developing it, it is an extremely difficult and potentially world-changing technical achievement, and given there isn't a computer science prize, physics is probably the most relevant category for it.

[–] vrighter@discuss.tchncs.de 10 points 1 month ago (1 children)

Not really. A lot of the techniques have been known for decades. What we didn't have back then was insane compute power.

And there's the Turing Award for computer science.

[–] PixelProf@lemmy.ca 7 points 1 month ago* (last edited 1 month ago)

Insane compute wasn't everything. Hinton helped develop the technique which allowed more data to be processed in more layers of a network without totally losing coherence. It was more of a toy before then because it capped out at how much data could be used, how many layers of a network could be trained, and, I believe, how efficiently GPUs could be used for ANNs, but I could be wrong on that last one.

Either way, after Hinton's research in ~2010-2012, problems that seemed extremely difficult to solve (e.g., classifying images and identifying objects in images) became borderline trivial, and in under a decade ANNs went from an almost fringe technology that many researchers saw as a toy useful for a few problems to basically dominating all AI research and CS funding. In almost no time, every university suddenly needed machine learning specialists on payroll, and now, about 10 years later, every year we are pumping out papers and tech that seemed many decades away... Every year... In a very broad range of problems.

The GTX 580 and CUDA made a big impact, but Hinton's work was absolutely pivotal in being able to utilize that and even to make ANNs seem feasible at all, and it was an overnight thing. Research very rarely explodes this fast.

Edit: I guess also worth clarifying, Hinton was also one of the few researching these techniques in the 80s and has continued being a force in the field, so these big leaps are the culmination of a lot of old, but also very recent work.
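(For anyone curious what the core mechanism described in the comment above looks like in practice, here is a deliberately minimal sketch: a tiny multi-layer network trained with backpropagation on XOR, in NumPy. It only shows the basic forward/backward/update loop; the network size, data, and hyperparameters are made up for illustration, and none of the scale, GPU kernels, or specific techniques from Hinton's work are represented.)

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR: a classic problem a single-layer perceptron cannot solve,
# but one hidden layer plus backpropagation handles easily.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Weights for a 2 -> 8 -> 1 network with sigmoid activations.
W1 = rng.normal(scale=1.0, size=(2, 8))
b1 = np.zeros((1, 8))
W2 = rng.normal(scale=1.0, size=(8, 1))
b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for step in range(10_000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: propagate the error gradient layer by layer
    # (squared-error loss, chain rule through each sigmoid).
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Plain gradient-descent updates.
    W2 -= lr * (h.T @ d_out)
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * (X.T @ d_h)
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

print(out.round(2))  # should be close to [[0], [1], [1], [0]]
```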

[–] skillissuer@discuss.tchncs.de 1 points 1 month ago (1 children)

I'm here to remind you that for the last 20-ish years, half the time the chemistry Nobel goes to biologists, and now they've doubled down on AI wankery by giving it to AlphaFold.

[–] AnarchistArtificer@slrpnk.net 6 points 1 month ago

To be fair, AlphaFold is pretty incredible. I remember that when it was first revealed (but before they open-sourced parts of it), the scientific community was shocked by how effective it was and assumed it was going to be technologically far more complex than it ended up being. Systems biologist Mohammed AlQuraishi captures this quite well in this blog post.

I'm a biochemist who has more interest in the computery side of structural biology than many of my peers, so I often have people asking me stuff like "is AlphaFold actually as impressive as they say, or is it just more overhyped AI nonsense?". My answer is "Yes."

[–] lambda_notation@lemmy.ml 58 points 1 month ago (2 children)

"They used physics to do it" is just a laughably pathetic motivation. Nobel hated "abstract wankery" or "intellectual masturbation" and wanted to promote results which benefitted the common man and society directly. This is incidentally also why there doesn't exist a Nobel prize in economics. The nobel prize comitte has since long abandoned Nobel's will in this matter and it is anyones guess what the order of magnitude of spin Nobel's corpse has accumulated.

it is anyone's guess what order of magnitude of spin Nobel's corpse has accumulated.

I'm guessing it's nearing the theoretical limits of "abstract wankery."

[–] prole@sh.itjust.works -3 points 1 month ago* (last edited 1 month ago) (1 children)

The Nobel Prize committee has long since abandoned Nobel's will in this matter

How long? Wouldn't this just kind of suggest that the criteria are simply different at this point? Complex electronic devices didn't exist back then, so one can really only guess what he would think, because he's been dead for a very long time...

The prize is named after the dude, he doesn't get to decide the rules of its award in perpetuity.

[–] Knock_Knock_Lemmy_In@lemmy.world 17 points 1 month ago

The prize is named after the dude, he doesn't get to decide the rules of its award in perpetuity.

Yes he does. It's the law.

Nobel's last will specified that his fortune be used to create a series of prizes for those who confer the "greatest benefit on mankind"

[–] msantossilva@sh.itjust.works 27 points 1 month ago (2 children)

I guess some people are genuinely concerned about AI wiping out humanity. Do not worry, that will never happen. We are already doing a fine job fostering our own extinction. If we keep going down our current path, those soulless robots will never even get the chance.

Now, in truth, I do not know what will kill us first, but I reckon it is important to stay positive.

[–] slaacaa@lemmy.world 9 points 1 month ago (2 children)

I mean it's definitely helping, but not in the way I imagined. It is becoming a major driver of CO2 emissions due to the large computational power it needs, which will only increase in the future. The planet is boiling, and they will keep building more server farms for the next LLM upgrade, giving up on stopping/controlling climate change.

[–] mindaika@lemmy.dbzer0.com 7 points 1 month ago

Wouldn’t that be something: we choke to death trying to create a supercomputer to tell us to stop doing exactly that

True irony

[–] Zos_Kia@lemmynsfw.com -1 points 1 month ago (1 children)

To clarify: AI is NOT a major driver of CO2 emissions. The most pessimistic estimates place it at a fraction of a percent of global energy consumption by 2030.

[–] mriormro@lemmy.world 2 points 1 month ago (1 children)
[–] Zos_Kia@lemmynsfw.com 1 points 1 month ago (1 children)

I mean, the same is true for crypto. BTC, the most energy-hungry blockchain, is estimated to burn ~150 TWh/year, compared to a global consumption of about 180,000 TWh/year.

Now, is that consumption useless? Yes, it is completely wasted. But it is a drop in the bucket. One shouldn't underestimate the astounding energy consumption of legacy industries; as a whole, the tech industry is estimated to represent just a few percent of the global energy budget.
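(To put rough numbers on "a drop in the bucket", using only the estimates quoted in the comment above, which are themselves approximate:)

```python
# Back-of-the-envelope check using the figures quoted in the comment above.
btc_twh_per_year = 150         # rough estimate for Bitcoin's annual consumption
global_twh_per_year = 180_000  # rough estimate for total global energy use

share = btc_twh_per_year / global_twh_per_year
print(f"BTC share of global energy use: {share:.3%}")  # ~0.083%
```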

[–] dubyakay@lemmy.ca 0 points 1 month ago* (last edited 1 month ago) (1 children)

Meh. Feelings are louder than facts apparently.

[–] Zos_Kia@lemmynsfw.com 1 points 1 month ago

Some myths are hard to kill, honestly.

[–] scarabic@lemmy.world 4 points 1 month ago

What’s laughable are the “terminator” scenarios where it suddenly comes to life in an instant and in that moment already has the power to wipe us out, and then does so.

A more likely scenario is that we come to rely on AI more and more heavily as time goes by, until it truly does have a grip on resource supply chains, manufacturing facilities, energy plants, etc. And I don't just mean that machine learning gets used in all of those contexts, because we are already there. I'm talking about custodial authority: we've ceded those duties to it in large part and can't do those jobs without AI.

Then a malicious AI could put a real squeeze on humanity. It wouldn’t need to be a global war. Just enough disruption that we starve and begin to war among ourselves. Has anyone ever noticed how many of us there are now? Our population would absolutely fall apart without our massive industrial and agricultural complexes running full time.

[–] Letstakealook@lemm.ee 16 points 1 month ago (5 children)

Predictive text algorithms will not wipe out humanity. 🙄

[–] SkyNTP@lemmy.ml 15 points 1 month ago

The problem isn't the technology. The problem is the people losing their minds about it.

[–] keropoktasen@lemmy.world 9 points 1 month ago (1 children)

The Artificial Intelligence field is much broader than that limited definition of yours.

[–] Letstakealook@lemm.ee 3 points 1 month ago (1 children)

And yet the only tech any company is interested in using is LLMs, which are about to fall flat on their face. What tech in the field is close to being able to think for itself and truly act autonomously?

[–] RatherBeMTB@sh.itjust.works -2 points 1 month ago* (last edited 1 month ago) (2 children)

I can tell you don't use AI. It's frightening how good it is. Edited "good"😂

[–] Letstakealook@lemm.ee 5 points 1 month ago

That's just true believer talk. It really is trash.

[–] dubyakay@lemmy.ca 4 points 1 month ago (1 children)

It's frightening how God it is.

Intentional?

[–] phdepressed@sh.itjust.works 2 points 1 month ago

Written by AI?

[–] EtherWhack@lemmy.world 5 points 1 month ago (1 children)
[–] Letstakealook@lemm.ee 5 points 1 month ago

That doesn't really count, lol. The reality is, we've already killed ourselves; we just won't admit it yet. The climate effects we're seeing today aren't even from recent emissions. Mr. Bones' Wild Ride has only just begun, and there's no getting off.

[–] ParetoOptimalDev 3 points 1 month ago (1 children)

Maybe the Nobel should have gone to you.

[–] Letstakealook@lemm.ee 2 points 1 month ago

The prize has nothing to do with these claims. Furthermore, past accomplishments do not make a person infallible. Nice ad hominem, though.

[–] sunbeam60@lemmy.one 0 points 1 month ago (1 children)

Such a lame hot take. Do you understand how language models work? To claim there’s no higher order understanding is frankly laughable.

[–] Letstakealook@lemm.ee 2 points 1 month ago (1 children)

If you legitimately believe LLMs "understand" anything at all, I really don't believe there's anything to discuss with you. That is a completely absurd notion at this stage.

[–] sunbeam60@lemmy.one 1 points 1 month ago

Well, why don't you argue with the guy who spearheaded the backpropagation algorithm, spends his whole day thinking about it, and won the Nobel Prize in Physics, rather than with me? I'm not saying some fanciful notion that isn't supported by evidence. If they just predict text, how can they solve riddles they've never encountered in their training materials? Are you claiming the logical solution is just text statistics?
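(As an aside, for readers wondering what "just predict text" means mechanically, here is a deliberately crude sketch: a bigram model that picks each next word from counted statistics. A transformer-based LLM learns enormously richer conditional distributions, so this illustrates only the bare concept of next-token prediction, not how modern models actually compute their predictions, which is the very point under dispute here. The toy corpus is made up.)

```python
# Toy illustration of next-word prediction from counted statistics.
from collections import Counter, defaultdict
import random

corpus = "the cat sat on the mat and the cat slept on the mat".split()

# Count which word follows which (a bigram table).
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def next_word(word):
    # Sample the next word in proportion to how often it followed `word`.
    counts = following[word]
    words, weights = zip(*counts.items())
    return random.choices(words, weights=weights)[0]

# Generate a short continuation, one predicted word at a time.
word = "the"
text = [word]
for _ in range(8):
    word = next_word(word)
    text.append(word)
print(" ".join(text))
```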

[–] vrighter@discuss.tchncs.de 10 points 1 month ago

And physicists use tools from math, so Fields Medals should be awarded to physicists.

[–] zlatiah@lemmy.world 6 points 1 month ago

So it was the physics Nobel... I see why the Nature News coverage said it had been "scooped" by machine-learning pioneers.

Since the news tried to be sensational about it... I tried to see what Hinton meant by fearing the consequences. I believe he is genuinely trying to prevent AI development from proceeding without proper regulation. This is a policy paper he was involved in (https://managing-ai-risks.com/), and it does mention some genuine concerns. Quoting them:

"AI systems threaten to amplify social injustice, erode social stability, and weaken our shared understanding of reality that is foundational to society. They could also enable large-scale criminal or terrorist activities. Especially in the hands of a few powerful actors, AI could cement or exacerbate global inequities, or facilitate automated warfare, customized mass manipulation, and pervasive surveillance"

like bruh people already lost jobs because of ChatGPT, which can't even do math properly on its own...

Also, there's quite some irony in the preprint containing the quote "Climate change has taken decades to be acknowledged and confronted; for AI, decades could be too long," considering that a serious risk of AI development is its climate impact.

[–] Pieresqi@lemmy.world 5 points 1 month ago

Yeesh, now everyone jumps on the AI hype train.

[–] Etterra@lemmy.world 5 points 1 month ago (1 children)

I mean we do kind of deserve it. But at least we've had a good run.

[–] sunbeam60@lemmy.one 0 points 1 month ago

Was our run really that good? We killed a bunch of species, drained our planet of resources and belched pollution into the air. I wouldn’t be surprised if the AIs manage to steward our planet better.

[–] mindaika@lemmy.dbzer0.com 4 points 1 month ago* (last edited 1 month ago)

It’s probably easier to righteously quit your job after a decade of collecting a senior executive salary.

Also: physics?