So far one of the best use cases for AI in software engineering has been identifying idiots and sociopaths.
Joke's on AI. It's harder to stop us from outing ourselves.
Oracle has a product called Oracle Policy Automation (OPA) that it sells as "you can write the rules in plain English in MS Word documents, you don't need developers". I worked for an insurance organization where the business side bought OPA without consulting IT, hoping they wouldn't have to deal with developers. It totally failed because it doesn't matter that they get to write "plain English" in Word documents. They still lack the structured, formal thinking to deal with anything except the happiest of happy paths.
The important difference between a developer and a non-developer isn't the ability to understand the syntax of a programming language. It's the willingness and ability to formalize and crystallize requirements and think about all the edge cases. As an architect/programmer, when I talk to the business side, they get bored and lose interest because of all my questions about what they actually want.
Edge cases are for teams that have internal testing AND care about quality.
A quick, easy way to know whether your new job is one of those is to open a 3-year-old project and find no unit tests.
It's happened a few times in my career where people tell me I'll be obsolete, but it's always been some company hyping their new product and suits frothing at the prospect of not having to pay me anymore.
So far they're like 0 for 8 or so.
Now I will say the goalposts move. What I'm doing now is for sure not what I was doing 10 years ago. I'm definitely heavier in devops and infra than I was before (ironic, because they said we'd never have to worry about that stuff again if we moved to the cloud). AI is still basically machine learning, just in a while loop, so I've spent time learning that. So, in a way, yes, we're obsolete in the sense that if I were the same engineer I was 10 years ago I wouldn't be worth nearly this much; I had to grow and evolve with the technology.
"Don't worry the salesman told me I would not need an infra team anymore ! Also do you know what is a vpc ?"
Oh don't worry, you can just pay <> 30x what you were paying your infra team before, or if that's too expensive, just pay a consulting firm 10x what you would have before. Then they can go dine on steaks while they have the same infra guy you had hired before doing the same stuff, just now in "teh cloud", but making less money.
DevOps was a lie pushed on devs to make them become sysadmins, unfortunately.
And DBAs. I'm currently working on a project where I said from the very start, I can set up this DB in k8s and I can get it to work decently, but I have neither the knowledge nor the time to get it right. Please give me someone who knows how this works.
No, don't worry, it'll be fine, we don't need that, this kuverneles thing I keep hearing about handles that!!!
Six months of hard contact with the enemy on production later:
Well, we're currently looking for someone who actually knows how DBs work, because we have one of those issues that would cost a proper DBA 5min and me 5 months.
I feel like there is a lost art of DBAs, in whose mystical knowledge rests how to make perfect, cheap, and scalable databases, and businesses cast them away because "Why not pay Google twice that amount?"
It was a fancy lie about their spare time, but especially in dotcom, there IS no spare time to learn architecture.
What I've seen of dev AND ops is that their knowledge is focused well on their own things. And when it comes to the other half of devops they just want the shortest path back to doing their thing. This has caused absolute princess devs to be nearly screaming about the hassle of security and change control and infrastructure and proper code deployment and testing and ... Well, a lot of things.
It doesn't pay to have people learning to half-ass dev because ops is your thing. You need advocacy on both sides of that line, still.
@scrubbles
cool
but it's always been some company hyping their new product and suits frothing at the prospect of not having to pay me anymore
i half expected it, after all it's what's happening right now
What I'm doing now is for sure not what I was doing 10 years ago.
that's right, i guess some aspects of programming have really been made obsolete
some aspects of programming have really been made obsolete
I'd agree that some specifics have been made obsolete. Some habits and routines are currently being ignored or skipped, but the amount of skill that's gone away is very small.
As mentioned before, we downsized brutally after Y2K. The people most affected were the highest-paid who weren't the best code-grinders, and these were the documenters, the programme people, and the mentor types. We lost our guides, our structure, and our historians. We've been growing again like feral children rebuilding society from the wasteland like it's Mad Max, and there's a LOT of the Why that we either don't know, that we ignore, or that we skip in the interests of (insert manufactured urgency here).
We are re-learning some of the whys, but we haven't yet seen the half-assedry chickens come home to roost on that. The symptoms are there: Boeing's Gilligan's Island in Space, supply-chain sploits in waves, personal information lost weekly, all of them consequences of dropping the clipboard hassles that prevent massively expensive things later.
CrowdStrike may die now, mainly because they were marauding leopards we allowed to eat our face. SolarWinds before that, same issue, but they seem to be okay. There are dozens of ohShit moments that could lead to similarly preventable problems, things we knew not to do ... once.
We'll get there again, but we'll be rediscovering a lot of what some techbro will claim is obsolete, old-practice, too-cautious hand-wringing in our neu and moderne go-hard/break-lives paradigm.
Salesforce advertised "No more developers" for a while in the mid 2010s. It was great fun trying to clean up the mess all the "not programmers" made of those systems. I really hate Salesforce. They must have some of the best salespeople on the planet.
And now job boards are full of ads for 'salesforce developers' that pay ridiculous amounts because nobody really wants to work on salesforce.
I know I’ve chosen to take lower paid jobs rather than work on Salesforce.
Zero-code has been about to make us all redundant for about five decades running so far.
Programmers become obsolete when they stop evolving with technology
If a tool were created that properly converted a UML diagram into a project without any need for code, all the programmers who lost their jobs to this tool would then be hired by the company that offered it, in order to maintain and support everything the customers want in their programs.
It would remove programmers from the payroll of some companies, but they would still be working for them, just further down the chain.
The same is true for AI. If AI could completely replace programmers in some area, it would need a lot of programmers itself to keep dealing with all the edge cases that would show up from being used everywhere that a programmer was needed before.
Besides. Somebody has to convert customer needs into the diagram. Account for what they’re not saying, etc.
That’s the real essential skill in software dev, not spitting out lines of code.
Yup. Business logic for things that cost millions or billions should not be run by an approximation machine.
To be fair, a lot of roles simply disappeared over the years.
Developers today are much more productive than 30 years ago, mostly because someone automated the boring parts away.
A modern developer can spin up a simple CRUD app, including infrastructure, in a day or so. That's much, much more productive than in 1995. We just cram a lot more of the world into software, so we need 20x the number of developers we needed back then.
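For a sense of scale, here's a hypothetical minimal sketch of that kind of CRUD app, using Flask and SQLite (choices and names invented for the example, and covering only the app part, not the infrastructure). The point is how little of it is real thinking versus routine plumbing:

```python
# Hypothetical minimal CRUD service (Flask + SQLite). Not production-ready;
# it just shows how little code the routine parts take nowadays.
import sqlite3
from flask import Flask, jsonify, request

app = Flask(__name__)
DB_PATH = "todos.db"

def get_db():
    conn = sqlite3.connect(DB_PATH)
    conn.row_factory = sqlite3.Row
    return conn

with get_db() as conn:
    conn.execute(
        "CREATE TABLE IF NOT EXISTS todos (id INTEGER PRIMARY KEY, title TEXT NOT NULL)"
    )

@app.route("/todos", methods=["POST"])
def create_todo():
    title = request.get_json()["title"]
    with get_db() as conn:
        cur = conn.execute("INSERT INTO todos (title) VALUES (?)", (title,))
    return jsonify(id=cur.lastrowid, title=title), 201

@app.route("/todos", methods=["GET"])
def list_todos():
    with get_db() as conn:
        rows = conn.execute("SELECT id, title FROM todos").fetchall()
    return jsonify([dict(row) for row in rows])

@app.route("/todos/<int:todo_id>", methods=["DELETE"])
def delete_todo(todo_id):
    with get_db() as conn:
        conn.execute("DELETE FROM todos WHERE id = ?", (todo_id,))
    return "", 204

if __name__ == "__main__":
    app.run()
```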
Was before my time, but IIRC C and other (then) high-level languages were supposedly going to put programmers out of jobs.
SQL was explicitly designed to allow "normal humans" to query the database. Nowadays even "normal developers" aren't able to use it properly.
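To make that concrete, here's a tiny Python/sqlite3 sketch (table and data invented for the example): the "plain English" query works exactly as designed, and then NULL's three-valued logic quietly eats a row, which is the kind of thing that still trips up "normal developers":

```python
import sqlite3

# In-memory toy database for the example.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (name TEXT, manager TEXT)")
conn.executemany(
    "INSERT INTO employees VALUES (?, ?)",
    [("Alice", "Carol"), ("Bob", None)],
)

# Reads almost like English -- this is what SQL was designed for.
print(conn.execute(
    "SELECT name FROM employees WHERE manager = 'Carol'"
).fetchall())  # [('Alice',)]

# The "obvious" way to find everyone not managed by Carol returns nothing,
# because NULL is never equal (or unequal) to anything, not even NULL.
print(conn.execute(
    "SELECT name FROM employees WHERE manager != 'Carol'"
).fetchall())  # [] -- Bob is silently missing

# The correct version needs the part "normal humans" never learn.
print(conn.execute(
    "SELECT name FROM employees WHERE manager IS NULL OR manager != 'Carol'"
).fetchall())  # [('Bob',)]
```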
sometimes, it feels like managers hate engineers, and are constantly plotting their replacement. maybe it's because it hurts their ego to know that the engineers they manage worked harder to get there and deserve a higher salary.
or else, it could be office politics. anyone who can claim to have removed an entire department from payroll is due a huge raise.
sometimes, it feels like managers hate engineers
They hate engineers because the engineers ask difficult questions that somebody needs to answer in order to really automate a process, and they take the time necessary to do so.
I don't think it's just managers saying hey we could automate such and such a thing away. It's human nature to think "how could I improve this" which almost immediately leads to "if I get this right it could mean no work at all"
that explains why the idea to replace engineers would enter people's minds, but not why they would try so, so hard to get people to believe it.
Every business's biggest expense is labor. Skilled labor costs more. The people in charge like it when you save money.
I think it's wrong. But only because the interests of the people who own the machines and businesses diverge from the worker's interests. I'd like to see more worker cooperatives. If the workers own the machines, then it's good when things are automated.
I also don't believe anything will ever be truly automated, or that it's a good idea to try.
All that to say we don't have to resort to an explanation of "managers must hate engineers" to understand why they would want to eliminate positions.
Rational Rose etc. could generate code from UML diagrams, then you "only" needed architects.
In reality it only gave a little help during the design phase; as soon as someone touched the generated code, you had to manually merge the changes back into the UML.
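Roughly what that round-trip pain looked like, as a hypothetical sketch (not actual Rose output): the generator owns the skeleton, you own the fenced-off regions, and anything edited outside them, or any drift between diagram and code, had to be reconciled by hand.

```python
# Hypothetical output of a UML-to-code generator for a class "Account".
# Only the marked regions are supposed to survive regeneration; everything
# else gets overwritten the next time someone changes the diagram.
class Account:
    def __init__(self, owner: str, balance: float = 0.0):
        self.owner = owner
        self.balance = balance

    def deposit(self, amount: float) -> None:
        # <<user-code begin: deposit>>
        if amount <= 0:
            raise ValueError("deposit must be positive")
        self.balance += amount
        # <<user-code end: deposit>>

    def withdraw(self, amount: float) -> None:
        # <<user-code begin: withdraw>>
        # Edit this method signature in code but not in the diagram, and the
        # model and the source stop agreeing; someone then merges them by hand.
        self.balance -= amount
        # <<user-code end: withdraw>>
```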
I had to learn how to use that in the military, used to call it crashinal rose
"AI" is just another productivity tool, copilot let's you remove some of the tedious patterned work you do, like writing all those asserts in Unit tests, it's decent at guessing html structures too.
So basically it makes a developer faster, but then so do stuff like a good IDE, good plugins for your workflow, etc.
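As an illustration of the "tedious patterned work" part, here's a hypothetical unit test (function and cases made up for the example) where the assert lines are exactly the boilerplate an assistant will happily autocomplete once it has seen the first one or two:

```python
import unittest

def parse_version(s):
    """Split a version string like '1.2.3' into integer parts."""
    return tuple(int(part) for part in s.split("."))

class TestParseVersion(unittest.TestCase):
    def test_parse_version(self):
        # Repetitive assert block: the boring part a tool can fill in.
        self.assertEqual(parse_version("1.2.3"), (1, 2, 3))
        self.assertEqual(parse_version("0.0.1"), (0, 0, 1))
        self.assertEqual(parse_version("10.20.30"), (10, 20, 30))
        self.assertEqual(parse_version("2.0"), (2, 0))
        with self.assertRaises(ValueError):
            parse_version("not-a-version")

if __name__ == "__main__":
    unittest.main()
```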
i saw an interesting take somewhere: even if AI could generate all the code for all the edge cases, you'd still need people to translate what the business wants into something the AI understands properly.
Writing code is already a small part of a developer's job; completely eliminating it won't eliminate the developer's job.
Even better quote, I love using this one.
"So, with AI writing code for us, all we need is an unambiguous way to define, what all our business requirements are for the software, what all the edge cases are, and how it should handle them."
"We in the industry call that 'code.'"
That's fun, I'm stealing that
Dude I WISH an AI would do all the dumb AWS crap for me so that I could just hang out and build React frontends all day
The thing that made me laugh when I saw the article that OP mentions is that it was coming from AWS.
In my testing AWS's Titan AI is the least useful for figuring out how to do things in AWS. It's so terrible that Amazon just announced they're using Claude for Alexa's upcoming "AI" features.
Fortran was supposed to replace computers (people). Then the computers became Fortran coders.
If only we lived in a world so simple as to allow the whims of managers, customers and third parties to be completely definable in UML
It's not happening, ever. Someone has to build the AI after all
Was thinking that may be why it's taking so long. It's akin to knowing you have to train your human replacement before you're fired. You can't possibly teach a program or a human everything you know in a limited time; and a great many don't want to.
The first time I heard about programming being obsolete was when I was taught UML in university. That was almost 15 years ago and it didn't happen; if anything, programmers now also have to know UML, which isn't all that bad, but it definitely didn't replace anything. It's just useful for designing and documenting projects.
I also heard from colleagues that in the 80s and 90s people said that SQL was supposed to be used by users directly, making (some) programming obsolete.
Now AI bullshit claims to be making programming obsolete. I won't hold my breath.
Same here only it was 20 years ago. UML professor was convinced it would replace programming.
The earliest I can think of (from personal experience) is 4GL languages; the early low-code platforms that first started to get traction in the early 80s. They wouldn't have replaced programmers but some thought/hoped they would usher in an age of "low skill" programmers that companies could get away with paying minimum wage to.
So far my experience with AI is that it cannot evaluate the quality of the data it uses to any significant degree. As such, it can summarize, which is convenient for searching, and give examples, but ultimately you have to correct its mistakes and know enough to do so. There is some savings for a programmer in the sense that you might be able to get some rough scaffolding, and it's a bit easier to identify relevant search links, but I don't see it replacing developers. It definitely allows one to do more, though, or even increase the quality. One really great thing it can do is auto-commenting of the code, which does not need as much improvement as actual code and makes it more likely for you to do the task (both because it does it for you and because it makes you go "no, don't explain it like that"). It similarly helps with documentation. I doubt it could more than double productivity, though, at least as it stands now. I'm not sure it can do much better without becoming general AI.
One really great thing it can do is auto commenting of the code
But then it only comments the 'what'; it cannot possibly know the 'why'. I know some devs disagree on that, but personally, I would rather not have what-comments in my code.
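A small made-up illustration of the difference (the functions and the VAT rationale are invented for the example):

```python
def total_price(items):
    # What-comment: restates the code, which is all an auto-commenter can see.
    # Sum the price of every item and multiply by 1.19.
    return sum(item["price"] for item in items) * 1.19

def total_price_documented(items):
    # Why-comment: records intent the code can't express, which no tool can
    # infer. (The rationale below is invented for this example.)
    # Prices are stored net; 1.19 applies the 19% VAT we must show on invoices.
    return sum(item["price"] for item in items) * 1.19
```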
- can AI replace the job of a real programmer, or a team of software engineers? Probably not for a long time.
- can managers abuse the fantasy that they could get rid of those pesky engineers who dare tell them something is impossible? Yes, totally. If they believe adding an AI tool to a team justifies a 200% increase in productivity, some managers will fire people against all metrics and evidence and call that move a success. The same thing happened when they tried to outsource code to cheaper teams.
I know this is c/programmerhumor but I'll take a stab at the question. If I may broaden the question to include collectively the set of software engineers, programmers, and (from a mainframe era) operators -- but will still use "programmers" for brevity -- then we can find examples of all sorts of other roles being taken over by computers or subsumed as part of a different worker's job description. So it shouldn't really be surprising that the job of programmer would also be partially offloaded.
The classic example of computer-induced obsolescence is the job of typist, where a large organization would employ staff to operate typewriters to convert hand-written memos into typed documents. Helped by the availability of word processors -- no, not the software but a standalone appliance -- and then the personal computer, the expectation moved to where knowledge workers have to type their own documents.
If we look to some of the earliest analog computers, built to compute differential equations such as for weather and flow analysis, a small team of people would be needed to operate and interpret the results for the research staff. But nowadays, researchers are expected to crunch their own numbers, possibly aided by a statistics or data analyst expert, but they're still working in R or Python, as opposed to a dedicated person or team that sets up the analysis program.
In that sense, the job of setting up tasks to run on a computer -- that is, the old definition of "programming" the machine -- has moved to the users. But alleviating the burden on programmers isn't always going to be viewed as obsolescence. Otherwise, we'd say that tab-complete is making human-typing obsolete lol
@litchralee
Thank you!
i didn't expect serious answers here, but this was a nice read,
so the various jobs around computers were kind of obsoleted, but the job description just shifted and the title remained valid most of the time,
now i'm interested to see what we'll do 20 years from now rather than just being annoyed by the "don't learn ${X}, it's outdated" guys
In the Neolithic era, I guess?