this post was submitted on 27 Mar 2024
38 points (100.0% liked)

Technology

A nice place to discuss rumors, happenings, innovations, and challenges in the technology sphere. We also welcome discussions on the intersections of technology and society. If it’s technological news or discussion of technology, it probably belongs here.

Remember the overriding ethos on Beehaw: Be(e) Nice. Each user you encounter here is a person, and should be treated with kindness (even if they’re wrong, or use a Linux distro you don’t like). Personal attacks will not be tolerated.

This community's icon was made by Aaron Schneider, under the CC-BY-NC-SA 4.0 license.

[–] scrubbles@poptalk.scrubbles.tech 26 points 7 months ago* (last edited 7 months ago) (20 children)

There are a handful of use cases where I've seen generative AI be useful:

  • Searching
  • Partner programming (how do I...)
  • Chatbot (as a novelty though; it gets boring quick, and the only great way of controlling it safely for business is by adding more AI)

And a few more probably.

I spent about six months deep-diving into how it all works. I dreaded that it would take my job and was determined to learn about it. What I learned is that there are many, many serious pitfalls that seem to be more or less ignored by, or unknown to, the businesses and people covering it.

I won't say it's as bad as blockchain, since there are real uses for it, but the hype is pretty damn close. Businesses think it will save them billions and that they can start getting rid of developers. Tech bros are lining up to say it's going to bring on the singularity.

Eh. It's cool. I wouldn't say it's going to bring the second coming of Jesus.

[–] Thorry84@feddit.nl 8 points 7 months ago (3 children)

There may be exceptions, but everything I've seen from AI programming is next-level trash. It's like copy-pasting from Stack Overflow without the thousand comments all around it saying DO NOT DO THIS!

When ChatGPT was first released to the general public, I wanted to try it out. I had it write a script to handle some simple parsing of network log files. I was having an intermittent issue with my home network that I couldn't figure out, so I had logged a lot of data and was hoping to pinpoint the problem. But I needed to filter out all the routine stuff that would just be background noise. I could have written it myself in about an hour, but figured hey, maybe ChatGPT could help me bang it out in a couple of minutes.

The code it wrote looked very good at a glance, and I was impressed. However, as I read it, it turned out to be total nonsense. It was using variables and declaring them afterwards. Halfway through the script it seemed to have switched to a completely different approach, leaving some sort of weird hybrid of the two. At one point it had just inserted pseudo-code instead of actual functional code. Every attempt to get it to fix its issues just made things worse. In the end I wrote the script myself.
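The comment doesn't show the actual script, so here's a minimal sketch of the kind of log filter being described, assuming syslog-style lines and a hypothetical list of "routine" patterns to suppress (the pattern names and sample lines are invented for illustration):

```python
import re

# Hypothetical routine-traffic patterns to suppress; a real list would be
# built up by inspecting one's own logs.
ROUTINE_PATTERNS = [
    re.compile(r"DHCPACK"),           # normal lease renewals
    re.compile(r"ntpd.*clock sync"),  # periodic time synchronization
]

def filter_noise(lines):
    """Yield only the log lines that don't match any routine pattern."""
    for line in lines:
        if not any(p.search(line) for p in ROUTINE_PATTERNS):
            yield line

# Invented sample data standing in for the logged network traffic.
logs = [
    "Mar 27 10:00:01 router dhcpd: DHCPACK on 192.168.1.10",
    "Mar 27 10:00:05 router kernel: eth0: link down",
    "Mar 27 10:00:09 router ntpd: clock sync ok",
]
print(list(filter_noise(logs)))
```

Roughly an hour of work by hand, as the commenter says; the point of the anecdote is that the generated version of something this simple still wasn't usable.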

I've seen examples from other people who attempted to use it, and it's just bad. It's like having a junior programmer high on weed writing your code; checking and fixing it takes more time than just writing the code yourself.

Then there's the issue of copyright: a lot of the training data wasn't licensed, and tools like GitHub Copilot want to add your data to their training set if you want to use them. That's not OK on many levels, and not even possible for people working on corporate codebases.

A lot of programmers work on big codebases, with things like best practices and code standards. Not only does the AI not know the codebase, and thus not know how to do a lot of things in it, it also doesn't know about the best practices and code standards. So for those kinds of situations it isn't useful.

I feel like people ask it to do some first-year programming tutorial task, see a result that looks somewhat like what one would expect, and conclude the thing can actually write code. In reality it really can't, and probably shouldn't even if it could.

[–] gromnar@beehaw.org 1 points 7 months ago

Well said... Thanks for spelling it out!
