this post was submitted on 13 Nov 2024
574 points (95.6% liked)

Science Memes

11068 readers

Welcome to c/science_memes @ Mander.xyz!

A place for majestic STEMLORD peacocking, as well as memes about the realities of working in a lab.



Rules

  1. Don't throw mud. Behave like an intellectual and remember the human.
  2. Keep it rooted (on topic).
  3. No spam.
  4. Infographics welcome, get schooled.

This is a science community. We use the Dawkins definition of meme.



top 21 comments
[–] azi@mander.xyz 7 points 2 days ago

There's plenty of stuff where ML algorithms are the state of the art. For example, the raw data from nanopore DNA sequencing machines is extremely noisy, and ML algorithms clean it up with far less error than the Markov-chain methods used in previous years.
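Not the real basecalling stack, just a minimal PyTorch sketch of the general idea: a tiny 1-D conv net learns to clean up a noisy piecewise-constant trace (a crude stand-in for a raw nanopore current signal) and ends up beating a naive moving-average baseline. Everything here (layer sizes, synthetic data, training budget) is made up for illustration.

```python
# Toy illustration only; NOT the actual nanopore basecalling pipeline.
import torch
import torch.nn as nn

def make_batch(n=64, length=256, noise=0.5):
    # Random piecewise-constant "signals" (levels in {-1, 0, 1}) plus Gaussian noise.
    levels = torch.randint(-1, 2, (n, 1, length // 16)).float()
    clean = levels.repeat_interleave(16, dim=-1)
    noisy = clean + noise * torch.randn_like(clean)
    return noisy, clean

# Small 1-D convolutional denoiser.
model = nn.Sequential(
    nn.Conv1d(1, 16, kernel_size=9, padding=4), nn.ReLU(),
    nn.Conv1d(16, 16, kernel_size=9, padding=4), nn.ReLU(),
    nn.Conv1d(16, 1, kernel_size=9, padding=4),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(500):
    noisy, clean = make_batch()
    loss = nn.functional.mse_loss(model(noisy), clean)
    opt.zero_grad()
    loss.backward()
    opt.step()

# Compare against a naive moving-average baseline on fresh data.
with torch.no_grad():
    noisy, clean = make_batch()
    baseline = nn.functional.avg_pool1d(noisy, kernel_size=9, stride=1, padding=4)
    print("conv-net MSE:       ", nn.functional.mse_loss(model(noisy), clean).item())
    print("moving-average MSE: ", nn.functional.mse_loss(baseline, clean).item())
```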

[–] slackassassin@sh.itjust.works 50 points 3 days ago (1 children)

Working with pretrained models implemented on FPGAs for particle identification and tracking. It's much faster and exactly as accurate. ¯\_(ツ)_/¯

[–] daniskarma@lemmy.dbzer0.com 21 points 3 days ago* (last edited 3 days ago)

Run, the Butlerian Jihad is already going your way.

[–] BugleFingers@lemmy.world 3 points 1 day ago (1 children)

A lot of new tech isn't as efficient, or is only equally efficient, at the get-go. Learning how to properly implement and utilize it is part of the process.

Right now we're just throwing raw computing power at it in ML form. As soon as it catches on and shows a little promise in an area, we can focus and refine. Sometimes you need to use the shotgun to see the rabbits, ya know?

[–] rando895@lemmygrad.ml 3 points 1 day ago

Physicists abhor a black box. So long as it is an option, most will choose not to use AI to any great extent, and will chastise those who do.

[–] Clent@lemmy.world 26 points 3 days ago (1 children)

The model actually required for general-purpose use likely lies beyond the petabyte range of memory.

These models are using gigabytes, and the trend indicates it's exponential. A couple more gigabytes isn't going to cut it. Layers cannot expand the predictive capabilities without increasing the error. I'm sure a proof of that will be along within the next few years.

[–] Krauerking@lemy.lol 9 points 2 days ago* (last edited 2 days ago)

"Come on man, I just need a couple more pets of your data and I will totally be able to predict you something useful!".
Its capacitors flip polarity in anticipation.

"I swear man! It's only a couple of orders of magnitude more, man! And all your dreams will come true. I'm sure I'll service you right!"

Well if it needs it, right?

[–] Nasan@sopuli.xyz 4 points 2 days ago
[–] fckreddit@lemmy.ml 11 points 3 days ago (1 children)

"There is no free lunch.", is a saying in ML research.

[–] SturgiesYrFase@lemmy.ml 15 points 3 days ago

That's just a saying.

[–] Dirac 13 points 3 days ago (1 children)
[–] propter_hog@hexbear.net 10 points 3 days ago

GET YOUR SHIT TOGETHER, CORAL

[–] belated_frog_pants@beehaw.org 3 points 2 days ago (1 children)

AI sucks ass, stop using it.

It doesn't. It's just overhyped.

[–] Alexstarfire@lemmy.world 7 points 3 days ago

For the meme? The Walking Dead. For the content? No idea.

[–] Collatz_problem@hexbear.net 4 points 3 days ago (1 children)

Usually it's not even faster.

[–] propter_hog@hexbear.net 15 points 3 days ago

And if it is faster, it just converges to the wrong answer faster.

[–] bigbrowncommie69@hexbear.net 2 points 3 days ago

Pretty much the only thing it's even remotely good for is as a toy.

[–] Reddfugee42@lemmy.world -2 points 3 days ago

So what you're saying, Dad, is it's nascent and already faster? Gotcha.