this post was submitted on 16 Mar 2025
831 points (98.8% liked)

Programmer Humor

[–] Psaldorn@lemmy.world 188 points 20 hours ago (2 children)

From the same group that doesn't understand joins and thinks nobody uses SQL, this is hardly surprising.

Probably got an LLM running locally, asking it to get data, and it's running 10-level-deep subqueries to achieve what two inner joins would in a fraction of the time.
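
For anyone who hasn't had the pleasure, a hypothetical sketch of the two approaches (every table and column name here is made up, and it's only three levels deep instead of ten):

```python
# Illustrative only: the same question answered with nested sub-queries
# versus two plain inner joins.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE users  (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE orders (id INTEGER PRIMARY KEY, user_id INTEGER, total REAL);
CREATE TABLE items  (id INTEGER PRIMARY KEY, order_id INTEGER, sku TEXT);
INSERT INTO users  VALUES (1, 'alice'), (2, 'bob');
INSERT INTO orders VALUES (10, 1, 9.99), (11, 2, 5.00);
INSERT INTO items  VALUES (100, 10, 'WIDGET'), (101, 11, 'GADGET');
""")

# The nested version: each level re-answers a question a join already answers.
nested = """
SELECT name FROM users WHERE id IN (
    SELECT user_id FROM orders WHERE id IN (
        SELECT order_id FROM items WHERE sku = 'WIDGET'
    )
);
"""

# Two inner joins: same result, one statement for the planner to optimise.
joined = """
SELECT u.name
FROM users u
JOIN orders o ON o.user_id = u.id
JOIN items  i ON i.order_id = o.id
WHERE i.sku = 'WIDGET';
"""

print(conn.execute(nested).fetchall())   # [('alice',)]
print(conn.execute(joined).fetchall())   # [('alice',)]
```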

[–] _stranger_@lemmy.world 76 points 18 hours ago* (last edited 18 hours ago) (3 children)

You're giving this person a lot of credit. It's probably all in the same table, and this idiot is probably doing something like a for-loop over an integer range (the length of the table) where it pulls the entire table down every iteration of the loop, dumps it to a local file, and then uses plain text search or some really bad regexes to find the data they're looking for.
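
Purely illustrative, do-not-do-this sketch of that pattern (all names invented); one SELECT with a WHERE clause replaces the whole thing:

```python
# Anti-pattern sketch: loop over the row count, re-downloading the entire
# table each iteration, dumping it to disk, then regex-searching the file.
import re
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE logs (id INTEGER PRIMARY KEY, message TEXT);
INSERT INTO logs VALUES (1, 'ok'), (2, 'ERROR: disk full'), (3, 'ok');
""")

(row_count,) = conn.execute("SELECT COUNT(*) FROM logs").fetchone()

hits = []
for _ in range(row_count):                                    # once per row...
    rows = conn.execute("SELECT * FROM logs").fetchall()      # ...pull the WHOLE table
    with open("dump.txt", "w") as f:                          # ...dump it to a file
        f.writelines(f"{rid}\t{msg}\n" for rid, msg in rows)
    with open("dump.txt") as f:                               # ...then grep it with a regex
        hits = [line for line in f if re.search(r"ERROR", line)]

print(hits)
# The sane version: SELECT * FROM logs WHERE message LIKE 'ERROR%';
```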

[–] morbidcactus@lemmy.ca 33 points 18 hours ago

Considering that's nearly exactly the kind of answer I've received during the technical part of interviews for jr data eng roles, you're probably not far off.

Shit, I've seen solutions done up that look like that, fighting the optimiser every step of the way (amongst other things).

[–] indepndnt@lemmy.world 12 points 16 hours ago

I think you're still giving them too much credit with the for loop and regex and everything. I'm thinking they exported something to Excel, got 60k rows, then tried to add a lookup formula to them. Since, you know, they don't use SQL. I've done ridiculous things like that in Excel, and it can get so busy that it slows down your whole computer, which I can imagine someone could interpret as their "hard drive overheating".

[–] makingStuffForFun@lemmy.ml 5 points 17 hours ago* (last edited 17 hours ago)

I have to admit I still have some legacy code that does that.

Then I found pandas. Life changed for the better.

Now I have lots of old code that I'll update, "one day".

However, even my old code, terrible as it is, does not overheat anything, and can process massively larger sets of data than 60,000 rows without any issue except poor efficiency.
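
For anyone in the same boat, a minimal pandas sketch (data and column names invented) of the kind of lookup that bogs Excel down around 60k rows but is trivial in a DataFrame:

```python
# Minimal sketch: the pandas equivalent of a VLOOKUP across two sheets.
import pandas as pd

orders    = pd.DataFrame({"customer_id": [1, 2, 1], "total": [9.99, 5.00, 3.50]})
customers = pd.DataFrame({"customer_id": [1, 2], "name": ["alice", "bob"]})

# One inner merge instead of a lookup formula copied down every row.
report = orders.merge(customers, on="customer_id", how="inner")
print(report)

# Per-customer totals in one line, instead of a hand-built pivot table.
print(report.groupby("name")["total"].sum())
```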

[–] Korhaka@sopuli.xyz 1 points 16 hours ago

They don't understand joins? How...