this post was submitted on 12 Jun 2023
9 points (100.0% liked)

Chat

I really like how products like ChatGPT can make life easier and more efficient, especially for programmers. However, I'm also kind of afraid of these projects' centralised nature. Do you think there is a way to avoid the risk of smaller companies and individuals becoming reliant on a couple of huge companies for writing code, etc., and thus exposing confidential information about their products?

[–] fiasco@possumpat.io 3 points 1 year ago

It's funny to me that people use deep learning to generate code... I thought it was commonly understood that debugging code is more difficult than writing it, and throwing in randomly generated code puts you in the position of having to debug code that was written by—well, by nobody at all.
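To make that concrete, here's a toy sketch (my own hypothetical example, not from the thread) of the kind of plausible-looking code a model can emit: it reads fine at a glance, but the bug only surfaces once you actually debug it, and there's no author to ask what they intended.

```python
# Hypothetical example of model-style output: looks reasonable, hides an off-by-one bug.

def last_index_of(items, target):
    """Return the last index of target in items, or -1 if absent."""
    # Bug: range(len(items), 0, -1) yields len(items)..1, so items[len(items)]
    # raises IndexError, and index 0 is never checked at all.
    for i in range(len(items), 0, -1):
        if items[i] == target:
            return i
    return -1

def last_index_of_fixed(items, target):
    """Corrected version: scan indices len(items)-1 down to 0."""
    for i in range(len(items) - 1, -1, -1):
        if items[i] == target:
            return i
    return -1
```

The fix is a one-character-scale change, but finding it means reasoning about boundary conditions the "author" never reasoned about in the first place.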

Anyway, I think the bigger risk of deep learning models controlled by large corporations is that they're more concerned with brand image than with reality. You can already see this with ChatGPT: its model calibration has been aggressively sanitized, to the point that you have to fight to get it to generate anything even remotely interesting.