this post was submitted on 18 Aug 2023
603 points (96.2% liked)

Programmer Humor


Welcome to Programmer Humor!

This is a place where you can post jokes, memes, humor, etc. related to programming!

For sharing awful code, there's also Programming Horror.

top 45 comments
[–] pewpew@feddit.it 36 points 1 year ago (1 children)

But the I in AI is actually a lowercase L, so it's short for Algorithm

[–] ShortFuse@lemmy.world 18 points 1 year ago* (last edited 1 year ago)

It kinda annoys me that the lowercase L glyph is taller than capital A. I don't mind there being a difference, but cap-height should be taller than lowercase letters.

Illuminati

[–] Granixo@feddit.cl 14 points 1 year ago (4 children)
[–] thisisnotgoingwell@programming.dev 123 points 1 year ago (4 children)

I'm guessing he's saying that companies are still using the same human-written code, but since AI is sexy right now and is being used to describe even simple programming logic, everything is "powered by AI".

[–] andrew@lemmy.stuart.fun 50 points 1 year ago (1 children)

And in 2013 the marketing keyword was "algorithm". The YouTube algorithm. The Reddit algorithm. Etc.

[–] thisisnotgoingwell@programming.dev 17 points 1 year ago (1 children)
[–] sundrei@lemmy.sdf.org 6 points 1 year ago (1 children)
[–] Streetdog@sh.itjust.works 4 points 1 year ago

Spray and pray!

[–] fidodo@lemm.ee 9 points 1 year ago

That was true like 5 years ago, but now companies are just irresponsibly calling out to LLMs as a function without proper safeguards instead.

[–] kautau@lemmy.world 7 points 1 year ago (1 children)

Even more likely is that the AIs that write code are trained on human-created code. So in most cases they aren't coming up with new, novel solutions to problems; they're just a far more advanced "copy and paste from StackOverflow"

[–] zuhayr@lemmy.world 6 points 1 year ago (1 children)

“copy and paste from StackOverflow”

I feel violated

[–] kautau@lemmy.world 7 points 1 year ago* (last edited 1 year ago)

Hey just remember the classic Quora answer:

https://www.quora.com/Why-should-I-hire-a-software-engineer-if-I-can-just-copy-and-paste-code-from-Stack-Overflow

They are paying $100,000: $1 to copy and paste code from Stack Overflow, and $99,999 to know where and when to paste the code and how to make it work.

Domain knowledge is real, and AI might level that up, but you'll be hard-pressed to find a junior engineer who, armed with the same tools as a senior engineer, gets dropped into a gig and can use AI or even StackOverflow well enough to be on the same playing field. AI can write me a function. But figuring out how broken a legacy codebase is and how that function can solve an issue is why engineers are still valuable…for now

[–] r00ty@kbin.life 3 points 1 year ago

I've heard this talk where I work. Senior plebs describing things that are obviously algorithms as AI. And this of course means we had AI before it was cool.

Nothing new here. Buzzwords are the only thing senior managers can understand.

[–] alphacyberranger@lemmy.world 31 points 1 year ago (1 children)

That's exactly the point. It's just how companies market their products nowadays.

[–] kittenbridgeasteroid@discuss.tchncs.de 8 points 1 year ago (2 children)

I mean, true AI isn't really a thing yet. People have been using the term "AI" wrong for a very long time now. Even ChatGPT isn't real AI.

[–] fidodo@lemm.ee 5 points 1 year ago (2 children)

Nobody can seem to consistently define what AI even means.

[–] docAvid@midwest.social 6 points 1 year ago (2 children)

Inevitable. AI is Artificial Intelligence. Nobody can define intelligence, so how can they define an artificial variety?

[–] Streetdog@sh.itjust.works 2 points 1 year ago (1 children)

You can define intelligence by referring to all the very intelligent people online 🧠

[–] zuhayr@lemmy.world 2 points 1 year ago

Thank you for your service

[–] kameecoding@programming.dev 1 points 1 year ago (1 children)
[–] PipedLinkBot@feddit.rocks 1 points 1 year ago

Here is an alternative Piped link(s): https://piped.video/ol2WP0hc0NY?t=33

Piped is a privacy-respecting open-source alternative frontend to YouTube.

I'm open-source, check me out at GitHub.

[–] jadero@programming.dev 5 points 1 year ago

It's a "gaps" problem.

Creationism has the "god of the gaps" where every new fossil forces them to set the goalposts closer together.

The people who think that human intelligence is something special have to adjust the spacing on the goalposts every time a corvid solves a new problem and every time someone figures out how to make a computer do something new.

[–] theycallmebeez@lemmy.ml 0 points 1 year ago (1 children)

ChatGPT is built upon a GPT language model, which is a type of Artificial Intelligence.

[–] Xylight@lemmy.xylight.dev 6 points 1 year ago* (last edited 1 year ago) (3 children)

(This isn't my opinion, I'm just saying what I think they mean)

They are saying it's not intelligent in any way though. It sees a bunch of words as numbers and spits out some new numbers that the prediction algorithm creates.

[–] LoafyLemon@kbin.social 17 points 1 year ago* (last edited 1 year ago) (1 children)

What you're thinking of as AI is actually a narrower version, while true intelligence is termed AGI.

Explanation:
The term 'AI' (Artificial Intelligence) refers to computer systems that can perform tasks that would typically require human intelligence, like recognizing patterns or making decisions. However, most AI systems are specialized and focused on specific tasks.

On the other hand, 'AGI' (Artificial General Intelligence) refers to a higher level of AI that possesses human-like cognitive abilities. AGI systems would be capable of understanding, learning, and applying knowledge across a wide range of tasks, much like us.

So, the distinction lies in the breadth of capabilities: AI refers to more specialized, task-focused systems, while AGI represents a more versatile and human-like intelligence.

[–] BlinkAndItsGone@lemm.ee 6 points 1 year ago* (last edited 1 year ago) (1 children)

The term ‘AI’ (Artificial Intelligence) refers to computer systems that can perform tasks that would typically require human intelligence,

That's everything computers do, though, isn't it? Pocket calculators would have fit this definition of AI in the 1970s. In the '60s, "computer" was a human job title.

[–] LoafyLemon@kbin.social 3 points 1 year ago (1 children)

Unless your pocket calculator can recognise patterns or make decisions, it doesn't fit the description.

[–] qwertyasdef@programming.dev 0 points 1 year ago

Really? I would argue that pocket calculators are AI

[–] jadero@programming.dev 3 points 1 year ago

Fair enough. What evidence have you got that it's any different than what humans do? Have you looked around? How many people can you point to that are not just regurgitating or iterating or recombining or rearranging or taking the next step?

As far as I can tell, much of what we call intelligent activity can be performed by computer software and the gaps get smaller every year.

[–] Yendor@sh.itjust.works 2 points 1 year ago (2 children)

That’s not how ChatGPT works.

GPT is an LLM that uses an RNN. An RNN (recurrent neural network) is not an algorithm.

[–] BleatingZombie@lemmy.world 5 points 1 year ago

It's not artificially intelligent either

[–] kittenbridgeasteroid@discuss.tchncs.de 1 points 1 year ago (1 children)

Yes, but a neural network is just a collection of ML algorithms.

[–] Yendor@sh.itjust.works 0 points 1 year ago

Yeah, but not really. The algorithms are available for free, but they don't do anything useful by themselves. The RNN is built by training the neural net, which uses grading/classification of the training data to increase or decrease millions of coefficients in a multi-layer filter. It's the training data, the classification feedback and the processing power that actually create the AI.
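
As a toy illustration of that "increase or decrease coefficients based on classification feedback" idea, here's a sketch of a single artificial neuron learning logical OR (nothing like the scale or architecture of a real RNN; the data and learning rate are made up):

#include <stdio.h>

int main(void) {
    /* Tiny labelled training set: learn the logical OR function. */
    double inputs[4][2] = {{0,0}, {0,1}, {1,0}, {1,1}};
    double labels[4]    = {0, 1, 1, 1};

    double weights[2] = {0.0, 0.0};  /* the "coefficients" */
    double bias = 0.0;
    double lr = 0.1;                 /* learning rate */

    for (int epoch = 0; epoch < 20; epoch++) {
        for (int s = 0; s < 4; s++) {
            /* Forward pass: weighted sum, thresholded to a 0/1 prediction. */
            double sum = bias + weights[0] * inputs[s][0] + weights[1] * inputs[s][1];
            double predicted = (sum > 0.0) ? 1.0 : 0.0;

            /* Classification feedback: nudge the coefficients up or down. */
            double error = labels[s] - predicted;
            weights[0] += lr * error * inputs[s][0];
            weights[1] += lr * error * inputs[s][1];
            bias       += lr * error;
        }
    }

    printf("learned weights: %.2f %.2f  bias: %.2f\n", weights[0], weights[1], bias);
    return 0;
}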

[–] manitcor@lemmy.intai.tech 12 points 1 year ago* (last edited 1 year ago)

The computer wrote the 2nd one by accident when someone asked it to bake a cake.

[–] mp3@lemmy.ca 4 points 1 year ago

Bullshit vs Bullshit²

[–] nothacking@discuss.tchncs.de 13 points 1 year ago (1 children)

Is this part of a Hi-Lo implementation for blackjack? (Also, ewww mixed types)

[–] obosob@feddit.uk 9 points 1 year ago* (last edited 1 year ago) (2 children)

Yeah, just use a char for card and test

if(card < '7') count++;
else count--;

Or something, don't mix types.

[–] nothacking@discuss.tchncs.de 1 points 1 year ago (1 children)

Well that won't work for 7, 8 or 9.

[–] obosob@feddit.uk 2 points 1 year ago

I didn't notice that 7,8,9 had no effect on the count. My bad.
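
Something like this would be closer (still just a rough sketch, assuming each card is a single char, with ten and the face cards encoded as 'T', 'J', 'Q', 'K' and 'A'):

/* Hi-Lo: 2-6 count +1, 7-9 count 0, tens/faces/aces count -1. */
if (card >= '2' && card <= '6')
    count++;
else if (card == 'T' || card == 'J' || card == 'Q' ||
         card == 'K' || card == 'A')
    count--;
/* '7', '8' and '9' leave the count unchanged. */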

[–] Hexarei@programming.dev 1 points 1 year ago (1 children)

The cards should just be numbers, and an enum should be used for display names

[–] obosob@feddit.uk 2 points 1 year ago

Chars are just numbers, but yeah, an enum would work fine too, sure. The only advantage of using a char is that no conversion is needed when outputting them into strings, so it's a little easier: less code, very readable, etc. Though, thinking about it, J/Q/K/A wouldn't be numerically in the right order, which could cause issues if the program did more than just implement Hi-Lo.
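
An enum with explicit values keeps the ranks in numeric order, though. Rough sketch:

/* Ranks in ascending order, so comparisons like card <= SIX work. */
enum rank { TWO = 2, THREE, FOUR, FIVE, SIX, SEVEN, EIGHT, NINE,
            TEN, JACK, QUEEN, KING, ACE };

/* Hi-Lo step using the enum instead of chars. */
if (card <= SIX)
    count++;
else if (card >= TEN)
    count--;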

That's the litmus test for programmers: when they start referring to "AI" as a clearly defined concept, you know they're artlessly making shit up for a quick buck.

[–] danhab99@programming.dev 3 points 1 year ago (1 children)

I hope I'm not being stupid right now, but is that the actual algorithm for counting cards in blackjack?

[–] rho_@lemmy.world 2 points 1 year ago* (last edited 1 year ago)

Half of it. This gives you the running count. You also need to keep track of how many decks are left in the shoe (the number of decks the shoe started with, minus the number of cards dealt since the last shuffle divided by 52), then divide the running count by the number of decks left to get the true count.

True count higher than 1? Start increasing your bet accordingly.
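
In code it's roughly this (just a sketch; the names are made up):

/* running_count: the Hi-Lo tally from above.
   decks_in_shoe: how many decks the shoe started with.
   cards_dealt:   cards dealt since the last shuffle. */
double true_count(int running_count, int decks_in_shoe, int cards_dealt) {
    double decks_left = decks_in_shoe - cards_dealt / 52.0;
    return running_count / decks_left;
}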

[–] coco@lemmy.world 2 points 1 year ago

Irony, I love you!!