this post was submitted on 21 Aug 2024
787 points (95.3% liked)

Technology

[–] Saledovil@sh.itjust.works 6 points 2 months ago (3 children)

The thing with AI is that what the term most often refers to today is neural networks, which are really advanced statistics. And to get more precise statistics, you need exponentially more data, while the marginal utility of each additional data point decays. So exponentially increasing marginal expenses meet exponentially decaying marginal utility.
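To make the diminishing-returns point concrete, here is a minimal toy sketch (an editorial illustration, not part of the original comment): the typical error of a sample mean shrinks only like 1/sqrt(n), so quadrupling the data roughly halves the error, and each extra data point buys less precision than the last.

```python
import random
import statistics

def estimate_mean(n_samples, seed=0):
    """Estimate the mean of a noisy source from n_samples draws."""
    rng = random.Random(seed)
    draws = [rng.gauss(0.0, 1.0) for _ in range(n_samples)]
    return statistics.fmean(draws)

def typical_error(n_samples, trials=200):
    """Average absolute estimation error across independent trials."""
    errors = [abs(estimate_mean(n_samples, seed=t)) for t in range(trials)]
    return statistics.fmean(errors)

# Error shrinks roughly like 1/sqrt(n): quadrupling the data
# only about halves the error.
err_100 = typical_error(100)
err_400 = typical_error(400)
```

With 4x the data the measured error only drops by about half, which is the diminishing-returns curve the comment is describing.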

[–] HK65@sopuli.xyz 12 points 2 months ago

Just to be clear, I am in love with statistics and especially generative algos, and have written papers on it before ChatGPT was a thing.

I just hate that one company made a chatbot with it and now the whole world is cargo culting around it.

[–] model_tar_gz@lemmy.world 4 points 2 months ago* (last edited 2 months ago) (1 children)

AI is a very broad term that also includes expert systems (such as computational fluid dynamics and finite element analysis approaches), as well as traditional machine learning approaches (like support vector machines). But yes, I agree: it's most commonly associated with deep learning/neural network approaches.

That said, it’s misleading and inaccurate to state that neural networks are just statistics. In fact they are substantially more than advanced statistics. Certainly statistics is a component, but so too are probability, calculus, network/graph theory, and linear algebra, not to mention the computer science needed to program, tune, train, and run inference with them. Information theory (hello, entropy) plays a part sometimes.

The amount of mathematical background it takes to really understand and practice the theory of both a forward pass and backpropagation is an entire undergraduate STEM curriculum’s worth. I usually advocate for new engineers in my org to learn it top down (by doing) and pull the theory as needed, but that’s not how I did it and I regularly see gaps in their decisions because of it.
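As a rough illustration of what a forward pass and backpropagation boil down to, here is a toy single-sigmoid-neuron sketch (an editorial example, not from the thread): the analytic chain-rule gradients can be sanity-checked against a finite-difference approximation.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(w, b, x):
    """Forward pass of a single sigmoid neuron."""
    return sigmoid(w * x + b)

def loss(w, b, x, y):
    """Squared error between prediction and target."""
    return 0.5 * (forward(w, b, x) - y) ** 2

def backprop(w, b, x, y):
    """Analytic gradients dL/dw and dL/db via the chain rule."""
    a = forward(w, b, x)
    delta = (a - y) * a * (1.0 - a)   # dL/da * da/dz
    return delta * x, delta

# Verify the hand-derived gradient against central finite differences.
w, b, x, y = 0.7, -0.2, 1.5, 1.0
dw, db = backprop(w, b, x, y)
eps = 1e-6
dw_num = (loss(w + eps, b, x, y) - loss(w - eps, b, x, y)) / (2 * eps)
```

Real networks stack thousands of these chain-rule steps across layers, which is exactly where the calculus and linear algebra background pays off.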

And to get actually good at it? One does not simply become an AI systems engineer/technologist. It’s years of tinkering with computers and operating systems; sourcing, scraping, querying, and curating data; building data pipelines; cleaning data; engineering modeling approaches for various data types and desired outcomes against constraints (data, compute, economic, social/political); implementing POCs; finetuning models; mastering accelerated computing (GPUs, TPUs) and distributed computation; and many others I’m sure I’m forgetting here. The number of adjacent fields I’ve had to deeply scratch on to make any of this happen is stressful just thinking about it.

They’re fascinating machines, and they’ve been democratized/abstracted to the point where it’s now as simple as `import torch`, `model.fit()`, `model.predict()`. But to be dismissive of the amazing mathematics and engineering under the hood that make them actually usable is disingenuous.

I admit I have a bias here—I’ve spent the majority of my career building and deploying NN models.

[–] Saledovil@sh.itjust.works 2 points 2 months ago* (last edited 2 months ago) (1 children)

> That said, it’s misleading and inaccurate to state that neural networks are just statistics. In fact they are substantially more than just advanced statistics. Certainly statistics is a component—but so too is probability, calculus, network/graph theory, linear algebra, not to mention computer science to program, tune, and train and infer them. Information theory (hello, entropy) plays a part sometimes.

What I meant when I said that they are advanced statistics is that that's what they do. I know that a lot of disciplines play a part in creating them, and that it's incredibly complicated; it took me quite a while to wrap my head around the back-propagation algorithm.

I also know that neural networks can do some really cool stuff. Recognizing tumors, for example. But it's equally dangerous to overestimate them, so we have to be aware of their limitations.

Edit: All that being said, I do recognize that you have spent much more time learning about and working with neural networks than I have.

[–] model_tar_gz@lemmy.world 1 points 2 months ago* (last edited 2 months ago)

Cool cool, we’re cool. I get a little triggered when I hear people say that NN/DL models are “fancy statistics”—it’s not the first time.

In what seems like another lifetime, my first engineering job was as a process engineer on a refinery-scale continuous chromatography unit in hydrocarbon refining. Fuck that industry, but there’s some really cool tech there nevertheless. Anyway, when I was first learning the process, the technician training me called it a series of “fancy filters,” and that triggered me too: adsorption is a really fascinating chemical process that uses a lot of math and physics to finely tune for desired purity, flowrate, etc., and to diminish it as “fancy filtration”!!!

He wasn’t wrong, you’re not either; but it’s definitely more nuanced than that. :)

Engineers are gonna nerd out about stuff. It’s a natural law, I think.

[–] ricdeh@lemmy.world -4 points 2 months ago

Friend, your brain is also just a neural network. "Advanced statistics" are happening in your head every second. There is nothing exceptional about humans, save for the immense complexity of our neural network.