It's just as crazy as saying "We don't need math, because every problem can be described using human language".
In other words, that might be true, but only as long as your problem is simple enough to be fully described in human language.
You want to solve a real problem? It has far more complexity and moving parts than you can just hand off to an LLM, because solving it takes an actual understanding of the problem.
Maybe more apt for me would be, “We don’t need to teach math, because we have calculators.” Like…yeah, maybe a lot of people won’t need the vast amount of domain knowledge that exists in programming, but all this stuff originates from human knowledge. If it breaks, what do you do then?
I think someone else in the thread said good programming is about the architecture (maintainable, scalable, robust, secure). Many LLMs are legit black boxes, and it takes humans to understand what's coming out, why, and whether it's valid.
Even if we have a fancy calculator doing things, there still need to be people who do math and can check the results. I've worked more with analytics than LLMs, and more times than I can count, the data was bad. You have to validate before everything else, otherwise garbage in, garbage out.
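For illustration, here's a minimal sketch of what "validate before everything else" can look like in an analytics pipeline. The file name, column names, and rules are all hypothetical, just to show the idea of failing loudly on garbage input instead of averaging it:

```python
import pandas as pd

def validate(df: pd.DataFrame) -> pd.DataFrame:
    """Reject obviously bad input before any analysis runs."""
    # Hypothetical schema: every record needs a user_id and a revenue value.
    required = {"user_id", "revenue"}
    missing = required - set(df.columns)
    if missing:
        raise ValueError(f"missing columns: {missing}")

    # Duplicates and null keys are classic "garbage in" sources.
    df = df.drop_duplicates().dropna(subset=["user_id"])

    # Negative revenue here is assumed to be an upstream bug, not a refund.
    bad = df["revenue"] < 0
    if bad.any():
        raise ValueError(f"{bad.sum()} rows have negative revenue")
    return df

clean = validate(pd.read_csv("events.csv"))  # fails loudly, not silently
print(clean["revenue"].mean())
```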
It sounds like a poignant quote, but it also feels superficial. Like, something a smart person would say to a crowd to make them say, "Ahh!" but that doesn't hold water for long.
And because they are such black boxes, there's the field of Explainable AI, which attempts to provide transparency.
However, to make sense of the output of explainable-AI tools, you still need domain experts who have experience interpreting what that data means and how to act on it.
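As a concrete illustration, one common explainability approach uses the SHAP library to attribute a model's predictions to its input features. The model and data below are synthetic stand-ins for a real black box, and the point stands: SHAP prints numbers, but deciding whether those attributions are plausible is still the domain expert's job.

```python
import shap
from sklearn.ensemble import RandomForestRegressor
from sklearn.datasets import make_regression

# Train a throwaway model on synthetic data (stand-in for a real black box).
X, y = make_regression(n_samples=500, n_features=5, random_state=0)
model = RandomForestRegressor(random_state=0).fit(X, y)

# SHAP assigns each feature a contribution to each individual prediction.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X[:10])

# A large value means "this feature pushed this prediction a lot" --
# whether that attribution makes real-world sense is the expert's call.
print(shap_values[0])
```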
It's almost as if any reasonably complex string of operations requires study. And that's what tech marketing forgets. As you said, it all has to come from somewhere.
Ha
If you ever write code for a living, the first thing you notice is that people can't explain what they need using natural language (which is what English, Mandarin, etc. are), even when they don't need to get into details.
Also, natural language can be vague and confusing. Look at legalese and law statutes. "When it comes to the law, NOTHING is understood!" -- Dragline
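A toy example of how a requirement that sounds unambiguous in English forks into incompatible programs (the requirement and data are invented for illustration): the spec "sort the users by name" never says how to treat case, and both readings below are faithful to it.

```python
users = ["alice", "Bob", "carol"]

# Interpretation 1: plain lexicographic sort -- uppercase sorts first.
print(sorted(users))                 # ['Bob', 'alice', 'carol']

# Interpretation 2: case-insensitive, which the stakeholder probably meant.
print(sorted(users, key=str.lower))  # ['alice', 'Bob', 'carol']
```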