this post was submitted on 25 Jan 2024
550 points (97.1% liked)

Greentext


This is a place to share greentexts and witness the confounding life of Anon. If you're new to the Greentext community, think of it as a sort of zoo with Anon as the main attraction.

Be warned:

If you find yourself getting angry (or god forbid, agreeing) with something Anon has said, you might be doing it wrong.

[–] Diplomjodler@feddit.de 31 points 10 months ago* (last edited 10 months ago) (4 children)

Most questions on SO these days are very specific, so I doubt ChatGPT would be able to come up with good answers for them. All the easy questions were answered long ago.

[–] TheActualDevil@sffa.community 13 points 10 months ago (3 children)

Especially since ChatGPT can't think of a new answer, right? It's working off data that's already somewhere online. It's just using predictive text to determine the next word based on what users have typed. So most of the answers people get from "AI" are already out there for them to get from real people.
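
For anyone wondering what "predicting the next word" looks like in practice, here's a toy sketch in Python (a tiny bigram counter, nothing like a real transformer, just the same basic idea of picking a likely next word from what came before):

```python
from collections import Counter, defaultdict

# Toy "predictive text": count which word tends to follow which in some training text.
corpus = "the quick brown fox jumps over the lazy dog the quick brown cat".split()

next_word_counts = defaultdict(Counter)
for current, following in zip(corpus, corpus[1:]):
    next_word_counts[current][following] += 1

def predict_next(word):
    """Return the word most often seen after `word` in the training data."""
    candidates = next_word_counts.get(word)
    return candidates.most_common(1)[0][0] if candidates else None

print(predict_next("quick"))  # -> "brown", the only word that followed "quick" in training
print(predict_next("the"))    # -> "quick" (seen twice, vs "lazy" once)
```

Real models predict over tokens using learned weights rather than raw counts, but that "next word from context" loop is the part being described here.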

[–] Japan_50@sh.itjust.works 8 points 10 months ago (3 children)

I don't know why you're getting downvoted. That is how it works, to my understanding (as a layperson). It was fed training data and is very good at predictive text. I don't think it can take concepts it's learned and apply them in novel ways.

[–] danielbln@lemmy.world 4 points 10 months ago (1 children)
[–] jpeps@lemmy.world 2 points 10 months ago

This is hilarious, but I don't think it fully answers the question. It's a good example of something novel that GPT can do, i.e. manipulating language according to new rules to create rhythm and rhymes.

However, to give a more over-the-top example: if you removed all mention of planes from its corpus, leaving only information on air resistance and materials science, and then asked it for the best way to cross the Atlantic, it would never invent a plane for you.

[–] jacksilver@lemmy.world 2 points 10 months ago (1 children)

Even if it could, there are a lot of APIs and documentation it hasn't been trained on enough (or at all) to answer questions about. The models can, at least currently, only contain so much information, so the more specific or detailed a response you need, the worse it'll do.

[–] Blue_Morpho@lemmy.world 1 points 10 months ago

Deciding what to write next based on what it just wrote is reasoning. So saying "it's just predicting the next word" is very dismissive if you haven't used it.

My personal experience: I spent hours googling for a script, gave up, and typed my problem into ChatGPT. It gave me working code in seconds.

It wasn't just cutting and pasting what was already on Google.

[–] DavidGarcia@feddit.nl 2 points 10 months ago

I swear, uninformed people who underestimate AI will be the death of us

[–] Honytawk@lemmy.zip 1 points 10 months ago

Good thing every single line of programming is already documented somewhere.

It doesn't need to think of new answers.

[–] LesserAbe@lemmy.world 9 points 10 months ago

I disagree. I use ChatGPT all the time: I'll tell it "here's my block of code," then "here's the error message I'm getting, how should I resolve this?" I could easily see it working for Stack Exchange questions. ChatGPT is useful because it's able to answer specific questions.

Of course it's completely wrong some percentage of the time, but I'd put that under 20% for the questions I ask it. You can tell it's wrong because the solution doesn't work, though if I'm not familiar with the subject matter I could waste a lot of time before I figure out why it's wrong.
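
If anyone wants to script that same "here's my code, here's the error" workflow instead of pasting into the chat UI, here's a rough sketch with the OpenAI Python client (the model name, the code snippet, and the error message below are just placeholders, and it assumes an API key is set in the environment):

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

code_snippet = "def add(a, b): return a - b"           # placeholder: your block of code
error_message = "test_add failed: expected 5, got -1"  # placeholder: the error you're seeing

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # any chat-capable model works here
    messages=[
        {
            "role": "user",
            "content": (
                f"Here's my code:\n{code_snippet}\n\n"
                f"Here's the error I'm getting:\n{error_message}\n"
                "How should I resolve this?"
            ),
        }
    ],
)

print(response.choices[0].message.content)  # the suggested fix; verify it before trusting it
```

Same caveat as above applies: the answer still needs to be tested, because when it's wrong it's confidently wrong.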

[–] idunnololz@lemmy.world 2 points 10 months ago

If you look at the new questions being asked, there are a lot of easy-to-answer, low-quality questions.

[–] ook_the_librarian@lemmy.world 2 points 10 months ago (1 children)

Which is probably how ChatGPT learned to code in the first place.

[–] Diplomjodler@feddit.de 4 points 10 months ago (1 children)
[–] ook_the_librarian@lemmy.world 3 points 10 months ago (1 children)

You haven't learned to add "probably" when you're sure of something on Lemmy?

[–] Diplomjodler@feddit.de 5 points 10 months ago