712 points · submitted 6 months ago* (last edited 6 months ago) by cypherpunks@lemmy.ml to c/programmerhumor@lemmy.ml
[-] ColdFenix@discuss.tchncs.de 80 points 6 months ago

The trick is to split the code into smaller parts.

This is how I code using ChatGPT:

  1. Have it analyze how to structure the program and then give me the code for the outline with not yet implemented methods and functions.
  2. Have it implement the methods and functions one by one with tests for each one.
  3. I copy the code and the test for each method and function before moving on to the next one, so that I always have working code.
  4. Despair because my code is working and I have no idea how it works and I have become a machine that just copies code without an original thought of my own.

This works pretty well for me as long as I don't work with obscure frameworks or in large codebases.

[-] HurlingDurling@lemm.ee 20 points 6 months ago

Actually, that's the trick when writing code in general, and also how unit tests help coding an application.

[-] DeathsEmbrace@lemmy.ml 15 points 6 months ago

This is exactly how you forget coding.

[-] Ghostalmedia@lemmy.world 7 points 6 months ago

I guess it's time to become a PM and spend the rest of my life drawing ugly PowerPoint slides.

[-] teichflamme@lemm.ee 2 points 6 months ago

As someone doing management I would kill to have ChatGPT build ugly slides for me

[-] averyminya@beehaw.org 2 points 6 months ago

You'd search it anyway.

[-] Tathas@programming.dev 9 points 6 months ago

To be fair, you're also describing working with other people.

[-] ch00f@lemmy.world 7 points 6 months ago

So my job (electrical engineering) has been pretty stagnant recently (just launched a product, no V2 on the horizon yet), so I've taken my free time to brush up on my skills.

I asked my friend (an EE at Apple) what skills I should acquire to stay relevant. He suggested three things: FPGAs, machine learning, and cloud computing. So far, I've made some inroads on FPGAs.

But I keep hearing about people unironically using ChatGPT in professional/productive environments. In your opinion, is it a fun tool for the lazy, or a tool that will be necessary in the future? Will future employers expect fluency with it?

[-] ColdFenix@discuss.tchncs.de 3 points 6 months ago

Right now it's a good but limited tool if you know how to use it. But it can't really do anything a professional in a given field can't already do. Although it may be a bit quicker at certain tasks, there is always a risk of errors sneaking in that can become a headache later.

So right now I don't think it's a necessary tool. In the future I think it will become necessary, but at that point I don't think it will require much skill to use anymore, as it will be much better at both understanding and actually accomplishing what you want. Right now the skill in using GPT4 is mostly in being able to work around its limitations.

Speculation time!

I don't think the point where it will be both necessary and easy to use is far off, tbh. I'm not talking about AGI or anything close to that; all it really needs is a version of GPT4 that stays consistent over long code generation and can plan out its work and then follow that plan for a long time.

[-] c0mbatbag3l@lemmy.world 1 points 6 months ago

That's like asking in the early 90's if knowing how to use a search engine will be a required skill.

Without a doubt. Just don't rely on it for your own professional knowledge; use it to get the busywork done and automate where you can. I have virtually replaced my search engine with Bing AI when troubleshooting at work, because it can find PDF manuals for obscure network hardware faster than I can sift through the first five pages of a Google search. It's also one of those things where the skill of the operator can change the output from garbage to gold. If you can't describe your problem or articulate what you want the solution to look like, then your AI is going to be just as clueless.

I don't know what the future will hold or how much of our white-collar workforce will be replaced by AI in the coming decades, but our cloud and automation engineers are not only leveraging LLMs but actively programming and training in-house models on company data. Bottom-rung data entry is going the way of the dodo in the next ten years for sure. Programmers will likely see the same change that translators did after translation software was developed: they moved from doing the job themselves to QA'ing the software.

Times are changing, but getting on board with AI and learning how to integrate it will be the next big thing in the IT world. It's not going to replace us anytime soon, but it will reduce the workforce as the years go by.

[-] PM_Your_Nudes_Please@lemmy.world 5 points 6 months ago* (last edited 6 months ago)

This is exactly how I use it. Just like with conversations, ChatGPT tends to lose the plot after a while. It starts to “forget” the start of the conversation, and has trouble parsing things. It’s great for the first few paragraphs then begins to drift. So only use it for a few “paragraphs” worth of code at a time.

And as always, you need to make sure that it’s not just pretending to know. It will confidently feed you incorrect information, so you need to double check it occasionally.

[-] Hazzia@discuss.tchncs.de 77 points 6 months ago

Saw an article on getpocket while at work by an alleged programmer who was "mourning the art of coding" because ChatGPT was doing such a good job that his non-coder friend was able to set up a webpage.

To be fair, I couldn't tolerate reading past the first paragraph, but it definitely felt like the dude didn't know the difference between functional code and good code. Like, sure, ChatGPT may be able to make a website, but good luck getting it to formulate anything non-generic.

[-] platypode@sh.itjust.works 61 points 6 months ago

I read that one. He literally described himself as a mediocre programmer and is excited about GPT as a way for mediocre programmers to be competitive again. I'm sure he's in for a really fun time when he has to find a bug in 12k lines of AI spaghetti he bolted together.

[-] Hazzia@discuss.tchncs.de 14 points 6 months ago

Ahhh, that certainly makes more sense. But now I'm kinda pissed at him for acting like some authority in the field up front, saying "oh no, ChatGPT is just so good," only to then admit that he's closer to the non-programmer end of the scale. That may just be my residual rage from trying to stomach that article bleeding through, though.

[-] Gabu@lemmy.ml 7 points 6 months ago

I mean, maybe ChatGPT is better than him, specifically.

[-] Kepabar@startrek.website 3 points 6 months ago

I'm excited for it for the same reasons.

I don't have skill in art or coding.

But AI platforms have let me produce things that work for my personal needs that would be beyond my abilities before starting the project.

[-] Thorned_Rose@kbin.social 47 points 6 months ago

As an ex webdesigner/dev, Squarespace, Weebly and the like killed my income well before ChatGPT did.

[-] RightHandOfIkaros@lemmy.world 48 points 6 months ago* (last edited 6 months ago)

ChatGPT is excellent for suggesting an idea that I haven't thought of. I can then use that as a springboard to write out something that works. Asking it if there are optimizations also can yield quite good results. I find I spend less time debugging on average, but not by a large margin.

[-] Touching_Grass@lemmy.world 16 points 6 months ago

Yea. Sucks that journalists and article/blog writers have convinced me that ChatGPT wants to fuck my wife and steal my kidney.

[-] shrugal@lemm.ee 27 points 6 months ago

I mean it does, it just can't yet.

[-] Gabu@lemmy.ml 5 points 6 months ago

Nah, it'll fuck your kidney and steal your wife.

[-] 0ops@lemm.ee 2 points 6 months ago
[-] stebo02@sopuli.xyz 45 points 6 months ago

I never copy code from ChatGPT. It's not my code, and it probably doesn't work. However, it is great at making suggestions on how to tackle a problem or how to improve your code. Use ChatGPT like Stack Overflow, but with instant replies.

[-] Railcar8095@lemm.ee 17 points 6 months ago

This. I always use that comparison: ChatGPT is Stack Overflow, or a very eager intern. Review what it gives you and write test cases.

[-] Lucidlethargy@sh.itjust.works 8 points 6 months ago

Man, it's great until it confidently feeds you incorrect information. I've been burned far too many times at this point...

[-] stebo02@sopuli.xyz 11 points 6 months ago

That's why you should always verify the information with other sources. Just like information you get from any other website/person. It's not any different.

[-] finestnothing@lemmy.world 5 points 6 months ago

I only verify information I get on the Internet if I don't agree with it or need to use it in an argument, and I'm not about to change

[-] kamenlady@lemmy.world 3 points 6 months ago

Oof, glad I'm not alone

Wait...

[-] Gabu@lemmy.ml 7 points 6 months ago* (last edited 6 months ago)

TBH, if you can't almost instantly figure out why and how ChatGPT suggested bad code, you shouldn't be using it at all - you're out of your depth.

It's why I'll gladly use it to suggest markdown or C code, but never for a complex Python library.

[-] c0mbatbag3l@lemmy.world 5 points 6 months ago

Blaming the AI for misinformation is like blaming Google for giving you bad search results.

Learn how to parse the data and fact check it. Usually you can get a hyperlink to the source to see if it's even reasonably trustworthy.

[-] tweeks@feddit.nl 6 points 6 months ago

Plain copy-paste without a critical eye is not recommended, but it surely provides good pieces of code from time to time, especially in obscure frameworks/languages, compared to what can be googled.

ChatGPT 4 is a really big difference from 3.5, though. What took me hours with 3.5 was fixed in a few minutes with 4.

[-] LesserAbe@lemmy.world 22 points 6 months ago

I just use it for snippets - "here's my function, how would I go about changing x?" Or, "here's my block of code, I'm getting this error, what am I missing?" (I know, I'm fine to share my code but not company code)

[-] SomeBoyo@feddit.de 18 points 6 months ago

You can host a model locally with gpt4all. So using company code shouldn't be a problem, since it wouldn't leave your machine.

[-] brbposting@sh.itjust.works 7 points 6 months ago

Nice.

Readers, note performance won’t match GPT-4. You can see the leaderboard then compare some of the available GPT4All models side-by-side. They may be sufficient for your needs.

[-] Gabu@lemmy.ml 5 points 6 months ago

Generally, same. It's also competent (not good) at boilerplate.

[-] Empathy@beehaw.org 15 points 6 months ago* (last edited 6 months ago)

Co-pilot can write some small very simple functions for me, sometimes saving me the need to look at documentation. It will still often fail at those, in my experience, and will consistently fail at anything more complex.

It will get better, but currently it's only a small help.

[-] vox@sopuli.xyz 4 points 6 months ago

It really helps with the boring parts and tasks that require doing the same thing over and over; for example, I use it to generate mapping functions. In my (abandoned) GBA emulator project, the function that maps the ARM instruction type enum to function pointers was generated almost entirely by ChatGPT.

[-] sooper_dooper_roofer@hexbear.net 12 points 6 months ago

Reality: the code is written in 5 minutes, and it's never debugged because the developer was fired by management. The buggy code is sent straight into production.

[-] Aabbcc@lemm.ee 11 points 6 months ago

I've been having a great time with copilot

[-] CanadaPlus@lemmy.sdf.org 5 points 6 months ago

Yeah, that's the way to go. Copilot or similar to automate the simple stuff, while you still do all the architecting and check whatever it suggests.

[-] regbin_@lemmy.world 9 points 6 months ago

I love LLMs for coming up with patterns to solve the problem but that's about it. I'll do the implementation myself.

[-] AMillionNames@sh.itjust.works 7 points 6 months ago

Sounds like job security to me.

[-] auf@lemmy.ml 6 points 6 months ago

It seems that you gotta learn prompt engineering
