[–] RedFrank24@lemmy.world 14 points 1 day ago* (last edited 1 day ago) (1 children)

I have certainly found that to be the case with developers working with me. They run into a small problem, so they instantly go to Copilot and just paste whatever it says the answer is. Then, because they don't know what they're asking or fully understand the problem, they can't comprehend the answer either. Later they come to me, having installed a library they don't understand and pasted in code Copilot hallucinated, and ask me why it's not working.

A little bit of stepping back and going "What do I hope to achieve with this?" and "Why do I have to do it this way?" goes a long way. It stops you going down rabbit holes.

Then again, isn't that what people used to do with StackOverflow?

[–] WhatsTheHoldup@lemmy.ml 7 points 1 day ago* (last edited 1 day ago) (1 children)

> Then again, isn't that what people used to do with StackOverflow?

Yes, one of the issues that StackOverflow answerers complained about most was the "XY problem".

https://meta.stackexchange.com/questions/66377/what-is-the-xy-problem

It's where you're trying to do X, but because you're inexperienced you erroneously decide Y must be the solution (even though it's a dead end), and then ask people how to do Y instead of X.
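The example usually given is someone who wants a file's extension (X), decides the answer must be "grab the last three characters of the filename" (Y), and only ever asks about Y. A minimal sketch of that, in Python (hypothetical filenames, not code from the thread):

```python
import os

filename = "index.html"

# Y -- what gets asked: "how do I get the last 3 characters of a string?"
# Only works when the extension happens to be exactly 3 letters long.
ext_y = filename[-3:]                  # "tml" -- wrong

# X -- what was actually needed: split off the extension properly.
ext_x = os.path.splitext(filename)[1]  # ".html"

print(ext_y, ext_x)
```

A good answerer spots the mismatch and asks "what are you actually trying to do?" instead of helping the asker polish Y.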

ChatGPT cranks that problem up to 11, because it has no problem enabling you to focus on Y for far longer than you should.

[–] superglue@lemmy.dbzer0.com 2 points 1 day ago

I find that interesting because sometimes AI actually does the opposite for me. It suggests approaches to a problem that I hadn't even considered. But yes, if you push it in a certain direction, it will certainly lead you along.