I think we're already largely there. Nobody really knows how the full computing stack works anymore; the whole thing is just too big to fit in your head. So there is a lot of software out there that's already effectively a black box. There's a whole joke about how large legacy systems are basically generation ships, where new devs have no idea how or why the system was built that way, and they just plug holes as they go.
However, even if people forget how to write code, it's not like it's a skill that can't be learned again if it becomes needed. And if we do get to the point where LLMs are good enough that people forget how to write code, then it means LLMs have just become the way people write code. I don't see how that's different from people who only know how to use a high-level language today. A JS dev won't know how to work with pointers, do manual memory management, and so on. You can even take it up a level and look at it from the perspective of a non-technical person asking a developer to write a program for them. They're already in this exact scenario, and that's the vast majority of the population.
And given the specification-writing approach I described, I don't actually see that much of a problem with the code being a black box. You would basically write contracts and the LLM would fill them in, and that way you have some guarantees about the behavior of the system.
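To make that concrete, here's a minimal sketch of what such a contract could look like. It's a hypothetical example (the function name `sort_records` and its properties are mine, not anything from the thread): the human writes the spec as executable checks, and whatever implementation the LLM produces has to pass them.

```python
from typing import List


def sort_records(records: List[dict], key: str) -> List[dict]:
    """Stand-in for an LLM-generated implementation of the spec below."""
    return sorted(records, key=lambda r: r[key])


def check_contract(impl) -> None:
    """The 'contract': properties any acceptable implementation must satisfy."""
    data = [{"id": 3}, {"id": 1}, {"id": 2}]
    out = impl(data, "id")
    assert [r["id"] for r in out] == [1, 2, 3]          # output is ordered by the key
    assert len(out) == len(data)                         # nothing dropped or added
    assert data == [{"id": 3}, {"id": 1}, {"id": 2}]     # input is not mutated


check_contract(sort_records)
```

The point is that the contract is the part you actually read and maintain; the generated body can stay a black box as long as the checks keep passing.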
It's possible people will start developing mysticism about software, but at this point most people already treat technology like magic. I expect there will always be people who have an inclination towards a scientific view of the world and who enjoy understanding how things work. I don't think LLMs are going to change that.
Personally, I kind of see a synthesis between AI tools and humans going forward. We'll be using this tech to augment our abilities, and we'll just focus on solving bigger problems together. I don't expect there's going to be some sort of intellectual collapse; rather, the opposite could happen, where people start tackling problems on a scale that seems unimaginable today.