this post was submitted on 10 Aug 2023

Programmer Humor

[–] saltesc@lemmy.world 6 points 1 year ago* (last edited 1 year ago) (1 children)

I spent 45 minutes with ChatGPT trying to get a quick fix for something I was querying with M.

It ended with me telling ChatGPT that if it worked for me, it would be fired: it kept trying to re-optimise my query, causing syntax and load errors, then "fixing" them by ignoring my query's criteria.

I ended up going old school and taking an extra 30 mins to just figure it out myself. Now that I know how it's done, it's surprisingly easy to understand.

So I took that as a compliment. Or ChatGPT just sucks at Power Query.

It probably learned, though. If ChatGPT ever helps anyone with transform queries around multi-level filtering criteria, that's because of my suffering.

[–] Ceon@kbin.social 3 points 1 year ago (2 children)

Does ChatGPT really learn from user input? I thought it always restarted from the same base model.

[–] JuxtaposedJaguar@lemmy.ml 4 points 1 year ago

In each session, the last several thousand words (from both the user and the AI) are kept in a context buffer and used as additional input to the neural network. But I don't think ChatGPT lets you choose which of the AI's responses go into that buffer, so you can't really "train" it in any sense of the word. If you want that functionality, use LLaMA.
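The context buffer described above can be sketched roughly like this. To be clear, this is an illustrative toy, not ChatGPT's actual implementation: the word budget, the message format, and the newest-first truncation strategy are all assumptions for the sake of the example.

```python
# Toy sketch of a rolling context buffer (assumed behaviour, not OpenAI's code).
# Older messages are dropped once the word budget is exceeded, so the model
# only ever "sees" the most recent part of the conversation.

def build_context(messages, max_words=3000):
    """Keep the most recent messages that fit within the word budget."""
    context = []
    budget = max_words
    for msg in reversed(messages):          # walk from newest to oldest
        words = len(msg["text"].split())
        if words > budget:                  # this message no longer fits
            break
        context.append(msg)
        budget -= words
    return list(reversed(context))          # restore chronological order

chat = [
    {"role": "user", "text": "a " * 2000},       # oldest: 2000 words
    {"role": "assistant", "text": "b " * 1500},  # 1500 words
    {"role": "user", "text": "c " * 500},        # newest: 500 words
]
context = build_context(chat, max_words=3000)
# The oldest 2000-word message no longer fits; only the last two survive.
print([m["role"] for m in context])  # → ['assistant', 'user']
```

Every new turn, the whole (truncated) buffer is re-sent as input, which is why the model appears to "remember" the session without its weights changing at all.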

[–] usrtrv@lemmy.ml 4 points 1 year ago

They will eventually incorporate user inputs into the model. So no, it won't learn in real time from other users, but at some point those inputs will be fed back into its training data.