this post was submitted on 21 Sep 2024
84 points (71.4% liked)

Technology

Please remove this if it's not allowed.

I see a lot of people here who get mad at AI-generated code, and I'm wondering why. I wrote a couple of Bash scripts with the help of ChatGPT, and if anything, I think it's great.

Now, I obviously didn't tell it to write the entire code by itself. That would be a horrible idea. Instead, I asked it questions along the way and tested its output before putting it into my scripts.

I am fairly competent in writing programs. I know how and when to use arrays, loops, functions, conditionals, etc. I just don't know Bash's syntax. I could have used any other language I know, but I chose Bash because it made the most sense: it ships with most Linux distros out of the box, so you don't have to install another interpreter or compiler. I don't like Bash because of its, dare I say, weird syntax, but it was the best fit for my purpose, so I chose it. I also hadn't written anything of this complexity in Bash before, just a bunch of commands in separate lines so I don't have to type them one after another. This one, though, required many rather advanced features. I was not motivated to learn Bash; I just wanted to put my idea into action.

I did start with an internet search, but the guides I found were lacking. I could not find how to pass values into a function and return them easily, how to remove a trailing slash from a directory path, how to loop over an array, how to catch errors from the previous command, how to separate the letters and numbers in a string, etc.

That is where ChatGPT helped greatly. I would ask it to write these pieces of code whenever I ran into them, then test its code with various inputs to see if it worked as expected. If not, I would tell it which case failed, and it would revise the code before I put it in my scripts.
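To give an idea, the pieces it produced looked roughly like this (a simplified sketch, with made-up names and paths rather than my actual script):

```bash
#!/usr/bin/env bash
set -euo pipefail

# Remove a trailing slash from a directory path.
dir="${1:-/tmp/example/}"
dir="${dir%/}"

# Pass a value into a function and "return" it by echoing.
backup_name() {
    local path="$1"
    echo "${path}.bak"
}
backup="$(backup_name "$dir")"

# Loop over an array.
steps=(configure build install)
for step in "${steps[@]}"; do
    echo "running $step"
done

# Catch an error from the previous command.
if ! cp -r "$dir" "$backup"; then
    echo "backup failed" >&2
fi

# Separate the letters and the numbers in a string like "abc123".
tag="abc123"
letters="${tag//[0-9]/}"   # strip the digits
numbers="${tag//[^0-9]/}"  # strip everything that isn't a digit
echo "$letters $numbers"
```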

Thanks to ChatGPT, someone with zero knowledge of Bash can easily and quickly write fairly advanced Bash. I don't think I could have written what I wrote nearly as quickly the old-fashioned way; I would have gotten there eventually, but it would have taken far too long. With ChatGPT I could just write it all quickly and move on. If I ever want to learn Bash and feel motivated, I will certainly take the time to learn it properly.

What do you think? What negative experiences have you had with AI chatbots that made you hate them?

[–] essteeyou@lemmy.world 4 points 2 months ago

I use it as a time-saving device. The hardest part is spotting when it's not actually saving you time, but costing you time in back-and-forth over some little bug. I'm often better off fixing it myself when it gets stuck.

I find it's just like having another developer to bounce ideas off. I don't want it to produce 10k lines of code at a time, I want it to be digestible so I can tell if it's correct.

[–] NeoNachtwaechter@lemmy.world 4 points 2 months ago

Now, I obviously didn't tell it to write the entire code by itself. [...]

I am fairly competent in writing programs.

Go ahead and use it. You are safe.

[–] Soup@lemmy.cafe 3 points 2 months ago

Because, despite how easy it is to dupe people into thinking your methods are altruistic, AI exists to save money by eradicating jobs.

AI is the enemy. No matter how you frame it.

[–] madsen@lemmy.world 3 points 2 months ago

I chose Bash because it made the most sense: it ships with most Linux distros out of the box, so you don't have to install another interpreter or compiler.

Last time I checked (because I was writing Bash scripts based on the same assumption), Python was actually present on more Linux systems out of the box than Bash.

[–] Banked3-Visa9@fedia.io 2 points 2 months ago

People aren't so much "mad" at AI or "hating" it; "concerned" is closer.

[–] obbeel@lemmy.eco.br 2 points 2 months ago* (last edited 2 months ago)

I have worked with somewhat large codebases before using LLMs. You can ask the LLM to point out a specific problem and give it the context. I honestly don't see myself being as capable without an LLM. And it is a good teacher; I learn a lot from using LLMs. No free advertisement for any of the suppliers here, but they are just useful.

You get access to information you can't find anywhere on the Web. There is a large, structural backlash against it, but it is useful.

(Edit) Also, I would like to add that the people who say questions won't be asked anymore seemingly never tried getting answers online in a discussion forum: people are viciously ill-tempered when answering.

With an LLM, you can just bother it endlessly and learn more about the world while you do it.

[–] lvxferre@mander.xyz 1 points 2 months ago

[NB: I'm no programmer. I can write a few lines of Bash because Linux; I'm just relaying what I've read. I do use those bots, but for something else: as a translation aid.]

The reasons that I've seen programmers complaining about LLM chatbots are:

  1. concerns that AI will make human programmers obsolete
  2. concerns that AI will reduce the market for human programmers
  3. concerns about the copyright of the AI output
  4. concerns about code quality (e.g. it invents libraries and functions out of thin air)
  5. concerns about the environmental impact of AI

In my opinion the first one is babble, the third one is complicated, but the other three are sensible.

My workplace of five employees and two owners has embraced it as an additional tool.

We have Copilot inside Visual Studio Professional, and it's a great time saver. We have a lot of boilerplate code that it can learn from, and why would I want to waste valuable time writing the same things over and over? If every list page follows the same pattern, then it's boring; we are paid to solve problems, not just write the same things.

We even have an AI-powered tool made by the owner: we type commands and it scaffolds all our boilerplate. Or it can watch the project, and if I update a model it will write the mutations and queries in C#, set up the GraphQL layer, and then implement some views in React/TypeScript.

[–] tal 1 points 2 months ago* (last edited 2 months ago)

I don't think that the current approaches being used by generative AIs are sufficient to reliably produce correct code; I think that they're more amenable to producing human-consumable output (and even there, I'm much more enthusiastic about their use for images than text, as things stand). A human only needs approximately correct material to cue their brain; CPUs are more particular.

We'll probably get there, in the same sense that we can ultimately produce human-level AI for anything, but I think that it'll entail higher-level reasoning about a problem, which present generative text approaches don't do.

I did start with an internet search... I could not find how to pass values into a function and return them easily,

So, now, this I have a hard time with.

When I search for "pass value function bash", this is the first page I get, which clearly shows an example:

https://stackoverflow.com/questions/6212219/passing-parameters-to-a-bash-function
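The gist of the answers there is just positional parameters, something along these lines:

```bash
# Arguments arrive as $1, $2, ...; "return" a value by echoing it.
greet() {
    local name="$1"
    echo "Hello, $name"
}

greeting="$(greet "world")"
echo "$greeting"   # prints: Hello, world
```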

This isn't a case where I'd consider generative AI useful; it's something for which existing material is already readily available via a search.

The other issue with using generative AI for coding is that, for taking pre-existing code for common tasks and using it in multiple programs, we already have an approach: use libraries. That way the code gets maintained, but doesn't need to be reimplemented by humans over and over.

Say someone says "I need linked-list code". Okay, I mean, that's a pretty common, plain Jane thing to need.

But if you use a library, and there's a bug in that code, and it gets fixed, then the bugfix propagates when you update to a newer library. If you generate a linked-list implementation, even if you wind up with working linked-list code at the end, then that isn't gonna happen.
