The only people who would say this are people that don’t know programming.
LLMs are not going to replace software devs.
Wrong, this is also exactly what people selling LLMs to people who can't code would say.
It's this. When boards and non-tech savvy managers start making decisions based on a slick slide deck and a few visuals, enough will bite that people will be laid off. It's already happening.
There may be a reckoning after, but wall street likes it when you cut too deep and then bounce back to the "right" (lower) headcount. Even if you've broken the company and they just don't see the glide path.
It's gonna happen. I hope it's rare. I'd argue it's already happening, but I doubt enough people see it underpinning recent layoffs (yet).
AI as a general concept probably will at some point. But LLMs have all but reached the end of the line and they're not nearly smart enough.
I can see the statement in the same way word processing displaced secretaries.
There used to be two tiers in business. Those who wrote ideas/solutions and those who typed out those ideas into documents to be photocopied and faxed. Now the people who work on problems type their own words and email/slack/teams the information.
In the same way, there are the programmers who design and solve the problems, and then the coders who take those outlines and make them actually compile.
LLMs will disrupt the coders, leaving the problem solvers.
There are still secretaries today. But there aren't vast secretary pools in every business like 50 years ago.
It'll have to improve by an order of magnitude for that effect. Right now it's basically an improved Stack Overflow.
It’ll replace brain dead CEOs before it replaces programmers.
I'm pretty sure I could write a bot right now that just regurgitates pop science bullshit and how it relates to Line Go Up business philosophy.
Edit: did it, thanks ChatJippity
import sys

def main():
    # Check if the correct number of arguments are provided
    if len(sys.argv) != 2:
        print("Usage: python script.py <PopScienceBS>")
        sys.exit(1)

    # Get the input from the command line
    PopScienceBS = sys.argv[1]

    # Assign the input variable to the output variable
    LineGoUp = PopScienceBS

    # Print the output
    print(f"Line Go Up if we do: {LineGoUp}")

if __name__ == "__main__":
    main()
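Example run (with a made-up buzzword as the argument): `python script.py "synergy"` prints `Line Go Up if we do: synergy`.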
if lineGoUp {
    CollectUnearnedBonus()
} else {
    FireSomePeople()
    CollectUnearnedBonus()
}
I know just enough about this to confirm that this statement is absolute horseshit
Sounds like the "NoOps" claims of a decade ago, that cloud would remove the need for infrastructure engineers. 😂🤣😂🤣😂🤣😂😂😂🤣
It isn't that AI will have replaced us in 24 months, it's that we will be enslaved in 24 months. Or in the matrix. Etc.
I'll take "things business people don't understand" for $100.
No one hires software engineers to code. You're hired to solve problems. All of this AI bullshit has 0 capability to solve your problems, because it can only spit out what it's already ~~stolen from~~ seen somewhere else
Guys that are putting billions of dollars into their AI companies making grand claims about AI replacing everyone in two years. Whoda thunk it
He who knows, does not speak. He who speaks, does not know.
--Lao Tzu...
But coding never was the difficult part. It's understanding a concept, identifying a problem, and solving it with the available methods. An AI just makes the coding part faster and gives me options to identify a possible solution more quickly. Thankfully there's a never-ending pile of projects, issues, todos, and stakeholder wants, so I don't see how we'd need fewer programmers. Maybe we need more to deal with AI, since now people can do a lot more in house instead of outsourcing, but as soon as that threshold is reached, companies will again contact large software companies. If people want to put AI into everything, you need people feeding the AI with company-specific data and instructing people on how to use it.
All I see is middle management getting replaced, because instead of a boring meeting, I could just ask an AI.
CEOs without a clue how things work think they know how things work.
I swear, if we had no CEOs from today on, the only impact would be that we'd have less gibberish being spoken.
If AI could replace anyone... it's those dingbats. I mean, what would you say, in this given example, the CEO does... exactly? Make up random bullshit? AI does that. Write a speech? AI does that. I love how these overpaid people think they can replace the talent, but they... they are absolutely required and couldn't possibly be replaced! Talent and AI can't buy and enjoy the extra big yacht, or private jets, or overpriced cars, or a giant oversized mansion... no, you need people for that.
This will be used as an excuse to try to drive down wages while demanding more responsibilities from developers, even though this is absolute bullshit. However, if they actually follow through with their delusions and push to build platforms on AI-generated trash code, then soon after they'll have to hire people to fix such messes.
If, 24 months from now, most people aren't coding, it'll be because people like him cut jobs to make a quicker buck. Or nickel.
How many times does the public have to learn: if the CEO says it, he probably doesn't know what he's talking about. If the devs say it, listen.
Of course they won't be; somebody has to debug all the crap AI writes.
Todays news: Rich assholes in suits are idiots and don’t know how their own companies are working. Make sure to share what they’re saying.
Yeah, that's not going to happen.
Yeah, writing the code isn't really the hard part. It's knowing what code to write and how to structure it to work with your existing code or potential future code. Knowing where things might break so you can add the correct tests or alerts. Giving time estimates on how long it will take to build the parts of the system and building in phases to meet your team's needs.
A company I used to work for outsourced most of their coding to a company in India. I say most because when the code came back, the internal teams always had to put a bunch of work in to fix it and integrate it with existing systems. I imagine that, if anything, LLMs will just take the place of that overseas coding farm. The code they spit out will still need to be fixed and modified so it works with your existing systems, and that work is going to require programmers.
So instead of spending 1 day writing good code, we'll be spending a week debugging shitty code. Great.
It's the same claim as when tools like Integromat, WayScript, PureData, vvvv and other VPLs (Visual Programming Languages) started to get some hype. I once worked for a company that strongly believed they'd "retire the need for coding", and my ex-boss was so confident and happy about that... Although VPLs were a practical thing, time is the ruler of truth, and every dev-related job vacancy I see asks for some programming language, the written kind (JS, PHP, Python, Ruby, Lua, and so on).
Because if you look closely, deep inside, voila, there's code in anything that is claimed to be no-code! Wow, could anyone imagine that? 🤯 /sarcasm
Everybody talks about AI killing programming jobs, but any developer who has had to use it knows it can't do anything complex in programming. What it's really going to replace is program managers, customer reps, most of HR, finance analysts, legal teams, and middle management. These people have very structured, rule-based day-to-days. Getting an AI to write a very customized queuing system in Rust to suit your very specific business needs is nearly impossible. Getting AI to summarize Jira boards, analyze candidates' experience, highlight key points of meetings (and obsolete most of them altogether), and gather data on outstanding patents is more in its wheelhouse.
I am starting to see a major uptick in recruiters reaching out to me, because companies are starting to realize it was a mistake to stop hiring Software Engineers in the hopes that AI would replace them. But now my skills are going to come at a premium, just like everyone else in Software Engineering with skills beyond "put a react app together".
While I highly doubt that'll come true for at least a decade, we can already replace CEOs with AI, you know? (:
https://www.independent.co.uk/tech/ai-ceo-artificial-intelligence-b2302091.html
I feel sorry for all those people in AWS that now have him as a leader...
If generative AI hasn't replaced artists, it won't replace programmers.
Generative AI is much better at art than coding.
Seriously, how can the CEO of a GPU company not talk to a developer? You have loads of them to interview.
Sure, Microsoft is happy to let their AIs scan everyone else's code, but is anyone aware of any software houses letting AIs scan their in-house code?
Any lawyer worth their salt won't let AIs anywhere near their company's proprietary code until they are positive that the AI isn't going to be blabbing the code out to every one of their competitors.
But of course, IANAL.
Let's assume this is true, just for discussion's sake. Who's going to be writing the prompts to get the code then? Surely someone who can understand the requirements, make sure the code functions, and then test it afterwards. That's a developer.
I seem to recall about 13 years ago when "the cloud" was going to put everyone in IT Ops out of a job. At least according to people who have no idea what the IT department actually does.
"The cloud" certainly had an impact but the one thing it definitely did NOT do was send every system and network admin to the unemployment office. If anything it increased the demand for those kinds of jobs.
I remain unconcerned about my future career prospects.
I'm going to call BS on that unless they are hiding some new models with huge context windows...
For anything that's not boilerplate, you have to type more as a prompt to the AI than just writing it yourself.
Also, if you have a behaviour/variable that is similar to something common, it will stubbornly refuse to do what you want.
I'm curious about what the "upskilling" is supposed to look like, and what's meant by the statement that most execs won't hire a developer without AI skills. Is the idea that everyone needs to know how to put ML models together and train them? Or is it just that everyone employable will need to be able to work with them? There's a big difference.
If you go forward 12 months the AI bubble will have burst. If not sooner.
Most companies who bought into the hype are now (or will be soon) realizing it's nowhere near the ROI they hoped for, that the projects they've been financing are not working out, that forcing their people to use Copilot did not bring significant efficiency gains, and more and more are realizing they've been exchanging private and/or confidential data with Microsoft and boy there's a shitstorm gathering on that front.
Says the person who is primarily paid with Amazon stock, wants to see that stock price rise for their own benefit, and won't be in that job two years from now to be held accountable. Also, who has never written any kind of code. Yeah…. Ok. 🤮