this post was submitted on 13 Mar 2024
882 points (99.8% liked)

Programmer Humor

Post funny things about programming here! (Or just rant about your favourite programming language.)

[–] alexdeathway@programming.dev 14 points 8 months ago (3 children)

Does anybody mind explaining how this might have happened?

[–] manny_stillwagon@mander.xyz 72 points 8 months ago (2 children)

Copilot is an LLM, so it's just predicting what should come next, word by word, based on the data it's been fed. It has no concept of whether or not its answer makes sense.

So if you've scraped a bunch of open source GitHub projects that this guy has worked on, he probably has a lot of TODOs assigned to him in various projects. When Copilot sees you typing "TODO(", it tries to predict what the next thing you're going to type is. And a common thing to follow "TODO(" in its data set is this guy's username, so it goes ahead and suggests it, whether or not the guy is actually on the project and suggesting him would make any sort of sense.
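To make that concrete, here's a toy sketch of the "most common continuation wins" idea (in Python, with completely made-up names and lines; this is not the real project, username, or how Copilot is actually implemented):

```python
from collections import Counter

# Toy stand-in for scraped open source code. Every name here is invented
# for illustration; it is not the real project or the real username.
corpus = [
    "// TODO(alice): tidy up the string matcher",
    "// TODO(alice): add benchmarks for the lexer",
    "// TODO(alice): remove this hack once upstream lands a fix",
    "// TODO(bob): handle the empty-input case",
]

def suggest_continuation(prefix, lines):
    """Return the most common text that followed `prefix` in the corpus."""
    counts = Counter()
    for line in lines:
        idx = line.find(prefix)
        if idx != -1:
            rest = line[idx + len(prefix):]
            # treat everything up to the closing parenthesis as one "token"
            counts[rest.split(")")[0]] += 1
    return counts.most_common(1)[0][0] if counts else None

# Typing "TODO(" gets you whoever dominated the training data,
# whether or not they're on your project.
print(suggest_continuation("TODO(", corpus))  # -> alice
```

A real LLM predicts over learned token probabilities rather than raw frequency counts, but the failure mode is the same: the statistically likely completion wins, not the sensible one.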

[–] pearsaltchocolatebar@discuss.online 8 points 8 months ago (3 children)

You can absolutely add constraints to control for hallucinations. Copilot apparently doesn't have enough, though.
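For example, one hypothetical, very simplified kind of constraint would be post-filtering suggested assignees against people who are actually on the project (the names and helper here are made up for illustration):

```python
# Hypothetical guardrail sketch: only keep a suggested assignee if they are
# actually a contributor on the current project. Names are invented.
project_contributors = {"bob", "carol"}

def filter_assignee(suggestion, contributors):
    """Drop completions that reference someone who isn't on the project."""
    return suggestion if suggestion in contributors else None

print(filter_assignee("alice", project_contributors))  # -> None (rejected)
print(filter_assignee("bob", project_contributors))    # -> bob
```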

[–] kadu@lemmy.world 42 points 8 months ago (1 children)

If GitHub Copilot is anything like Windows Copilot, I can't say I'm surprised.

"Please minimize all my windows"

"Windows are glass panes invented by Michael Jackson in imperial China, during the invasion of the southern sea. Sources 1 2 3"

[–] Darkassassin07@lemmy.ca 20 points 8 months ago

Lmao. That's even better when you consider the Copilot button replaced the 'show desktop' (i.e. 'minimize all my windows') button.

[–] shootwhatsmyname@lemm.ee 16 points 8 months ago

My guess is that Copilot was using a ton of other lines as context, so in that specific case his name was a more likely match for the next characters.

[–] jherazob@beehaw.org 1 points 8 months ago

No matter how many constraints you add, it's never enough. That's the weakness of a model that only knows language and nothing else.

[–] alexdeathway@programming.dev 2 points 8 months ago

I thought it had synced some requests and assigned the project to another user (I saw an ad a while ago about GitHub Copilot managing issues and writing PR descriptions).

[–] dojan@lemmy.world 16 points 8 months ago* (last edited 8 months ago) (1 children)

It’s no different from GPT knowing the plot of Aliens or who played the main role in Matilda.

It's seen enough code to recognise the pattern: it knows an author name goes in there, and Phil Nash is likely a prolific enough author that it just plopped his name in. It's not intelligence, just patterns.

[–] planish@sh.itjust.works 5 points 8 months ago

"Yeah this sounds like a Phil Nash sort of problem, I'll just stick him in here."

[–] ilinamorato@lemmy.world 15 points 8 months ago

The other answers are great, but if I were to be a bit more laconic:

Copilot is spicy autocorrect. It autocorrected that todo to insert that guy's name because he gets a lot of todos.