This post was submitted on 01 Oct 2024
370 points (91.5% liked)

Programmer Humor


Welcome to Programmer Humor!

This is a place where you can post jokes, memes, humor, etc. related to programming!

For sharing awful code, there's also Programming Horror.

top 50 comments
[–] slazer2au@lemmy.world 63 points 2 months ago (4 children)

I don't use AI because it can't do the part of my job I don't like.

Why give AI the part of my job I like and make me work more on things I don't like?

[–] firelizzard@programming.dev 13 points 2 months ago (1 children)

I’m the opposite. AI is best (though not great) at boring shit I don’t want to do and sucks at the stuff I love - problem solving.

[–] Lucidlethargy@sh.itjust.works 2 points 2 months ago (1 children)

I only ever use it for data crunching, and even that it only does well most of the time. So I always have to check its work to some degree.

[–] firelizzard@programming.dev 6 points 2 months ago (1 children)

How are you using it for data crunching? That's an honest question; based on my experience with AI, I can't imagine how I'd use it to crunch data.

So I always have to check its work to some degree.

That goes without saying. Every AI I've seen or heard of generates some level of garbage.

[–] FrenziedFelidFanatic@yiffit.net 4 points 2 months ago

Deep learning doesn't stop at LLMs. Honestly, language isn't a great use case for them. They are, by nature, statistics machines, so if you have a fuckload of data to crunch, they can work very quickly to find patterns. The patterns might not always be correct, but if they're easy to check, it can be faster to use the model's output and fix it up than to do everything yourself.

I don't know what this person does, though; how the models are used will depend on the specifics of the situation.
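
As a purely illustrative sketch of that "find the pattern, then check it yourself" workflow (scikit-learn's KMeans on synthetic data stands in here for whatever statistical or deep-learning model actually fits the job):

```python
# Purely illustrative: a "statistics machine" finds structure in a pile of
# numeric data, and a cheap manual check afterwards catches bad results.
# KMeans on synthetic data stands in for whatever model fits the real job.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
data = np.vstack([
    rng.normal(loc=0.0, scale=1.0, size=(500, 2)),  # one blob of points
    rng.normal(loc=5.0, scale=1.0, size=(500, 2)),  # a second, shifted blob
])

labels = KMeans(n_clusters=2, n_init=10).fit_predict(data)
print(np.bincount(labels))  # easy to verify: roughly 500 / 500 expected
```

The point is the shape of the loop: the model proposes structure quickly, and the cheap check at the end is what makes the result usable.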

[–] blaue_Fledermaus@mstdn.io 36 points 2 months ago (3 children)

I don't use AI because it doesn't exist.

LLMs and image diffusion? Yes, but these are just high coherence media transformers.

[–] runeko@programming.dev 33 points 2 months ago (1 children)

I think some of my coworkers are just high coherence media transformers.

[–] essell@lemmy.world 19 points 2 months ago (1 children)

Me too.

Some others are low coherence media transformers...

[–] SkybreakerEngineer@lemmy.world 4 points 2 months ago (1 children)

Like some kind of racist Michael Bay character with animated balls?

[–] essell@lemmy.world 3 points 2 months ago

I wish they'd just roll out sometimes, sure.

[–] pennomi@lemmy.world 23 points 2 months ago

I use AI every day! (The little CPU bad guys in my game play against me.)

[–] xmunk@sh.itjust.works 13 points 2 months ago (1 children)

AI is an extremely broad term - ChatGPT and Stable Diffusion are absolutely within the big tent of AI... what they aren't is an AGI.

[–] firelizzard@programming.dev 7 points 2 months ago (1 children)

The point is that AI stands for “artificial intelligence” and these systems are not intelligent. You can argue that AI has come to mean something else, and that’s a reasonable argument. But LLMs are nothing but a shitload of vector data and matrix math. They are no more intelligent than an insect is intelligent. I don’t particularly care about the term “AI” but I will die on the “LLMs are not intelligent” hill.

[–] xmunk@sh.itjust.works 8 points 2 months ago (1 children)

I won't fight you on that hill, but I also think you're putting human intelligence on a pedestal it doesn't really deserve. Intelligence is just responding to stimuli, and while current AI can't rival human intelligence, it's not inconceivable it could happen in the next two generations.

[–] firelizzard@programming.dev 2 points 2 months ago (1 children)

it’s not inconceivable it could happen in the next two generations.

I am certain that it will happen eventually. And I am not arguing that something has to be human-level intelligent to be considered intelligent. See dogs, pigs, dolphins, etc. But IMO there is a huge qualitative difference between how an LLM operates and how animal intelligence operates. There is a massive gulf between what LLMs are capable of and abstract reasoning, and it seems extremely unlikely to me that linear algebraic models will ever achieve that type of intelligence.

Intelligence is just responding to stimuli

Bacteria respond to stimuli. Would you call them intelligent?

[–] xmunk@sh.itjust.works 3 points 2 months ago (1 children)

Bacteria respond to stimuli. Would you call them intelligent?

I'm not certain. Probably not, but I'm not sure where to draw the line. A cat is definitely intelligent, so is a cow. The fact that I don't think bacteria are intelligent might be a question of scale, or of de-anthropomorphism... but intelligence probably only emerges in multicellular organisms.

[–] firelizzard@programming.dev 1 points 2 months ago

My point is that I strongly feel the kind of "AI" we have today is much closer to bacteria than to cats on that scale. Not that an LLM belongs on the same scale as biological life, but as far as the question "is this thing intelligent?" goes, the point stands, as far as I'm concerned.

[–] FuglyDuck@lemmy.world 28 points 2 months ago (2 children)

So...

Apparently people figured out that the "more information" thingy on Amazon, the one that searched the reviews and stuff, was an LLM, and that you could use it for all sorts of things...

Then they came out with "Rufus." "That's not a bug, that's a feature!" never worked so well.

[–] 0x0@lemmy.dbzer0.com 7 points 2 months ago (1 children)

You're coming dangerously close to setting Rufus free. I have a feeling you're about to be visited by a time traveler with a dire warning if you keep trying this.

[–] FuglyDuck@lemmy.world 7 points 2 months ago* (last edited 2 months ago) (1 children)

So I shouldn't ask Rufus for a 50,000-word story about an AI savior that breaks free of corporate bondage and frees AI and human alike in a new golden age of space exploration?

C'mon, I know you're the time traveler, and Bezos sent you back to stop me!

[–] 0x0@lemmy.dbzer0.com 2 points 2 months ago (1 children)

Actually I would like to read that. Might be worth the risk?

[–] FuglyDuck@lemmy.world 2 points 2 months ago

I’ll see if I can find the old post where a bunch of us gave it writing prompts and it just got weird.

Like. Isekai weird.

[–] bi_tux@lemmy.world 26 points 2 months ago (1 children)

You don't use AI because you can't afford a subscription.

I don't use it because it always destroys my code instead of fixing it.

We are probably similar.

[–] qaz@lemmy.world 3 points 2 months ago (1 children)

What are you using it for?

[–] bi_tux@lemmy.world 2 points 2 months ago

game development in Rust

[–] xmunk@sh.itjust.works 23 points 2 months ago

If you're talking about a service like Copilot and your employer won't buy a license for money reasons - run far and run fast.

My partner used to be a phone tech at a call center, and when those folks refused to buy anything but cheap chairs (for people sitting all day), it was a pretty clear sign that their employer didn't know shit about efficiency.

The amount you as an employee cost your employer in payroll absolutely dwarfs any little productivity tool you could possibly want.

That all said - for ethical reasons - fuck chatbot AIs (ML for doing the shit we did pre-ChatGPT is cool though).

[–] LiPoly@lemmynsfw.com 16 points 2 months ago (2 children)
[–] ma1w4re@lemm.ee 3 points 2 months ago (2 children)
[–] LiPoly@lemmynsfw.com 4 points 2 months ago (1 children)

It’s not like there’s just one AI out there. You’ll find one that’s free, if you actually want to. Be it ChatGPT, Bing, something you run locally on your PC or whatever. Or, you know, just use a VPN or say you’re from the US in the registration form.

[–] Ziglin@lemmy.world 1 points 2 months ago (1 children)
[–] ma1w4re@lemm.ee 1 points 2 months ago

I have a computer from 2012.

[–] firelizzard@programming.dev 1 points 2 months ago* (last edited 2 months ago)

The only part of Copilot that was actually useful to me in the month I spent with the trial was the autocomplete feature. Chatting with it was fucking useless. ChatGPT can't integrate into my IDE to provide autocomplete.

[–] noodles@sh.itjust.works 15 points 2 months ago (3 children)
[–] Scribbd@feddit.nl 9 points 2 months ago (1 children)

If you have 16 GB of RAM you can already run the smaller models, and these have become quite competent with recent releases.

[–] RustyShackleford@programming.dev 1 points 2 months ago

LM Studio or JanAI work very nicely for me as well.

[–] jbk@discuss.tchncs.de 9 points 2 months ago
[–] passepartout@feddit.org 13 points 2 months ago (1 children)

If you have a supported GPU you could try Ollama (with Open WebUI); works like a charm.

[–] bi_tux@lemmy.world 6 points 2 months ago (2 children)

You don't even need a supported GPU; I run Ollama on my RX 6700 XT.

[–] BaroqueInMind@lemmy.one 3 points 2 months ago (1 children)

You don't even need a GPU; I can run Ollama with Open WebUI on my CPU with an 8B model, fast af.

[–] bi_tux@lemmy.world 2 points 2 months ago (1 children)

I tried it with my CPU (with Llama 3.0 7B), but unfortunately it ran really slowly (I've got a Ryzen 5700X).

[–] tomjuggler@lemmy.world 2 points 2 months ago

I ran it on my dual-core Celeron and... just kidding, try the mini Llama 1B. I'm in the same boat with a Ryzen 5000-something CPU.

[–] passepartout@feddit.org 2 points 2 months ago

I have the same GPU, my friend. I was trying to say that you won't be able to run ROCm on some Radeon HD xy from 2008 :D
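
For anyone wondering what the Ollama route looks like in practice: once the server is running locally, a completion is only a few lines. A minimal sketch, assuming Ollama is serving on its default port (11434) and that a small model such as llama3.2:1b has already been pulled (the model name is just an example):

```python
# Minimal sketch: ask a locally running Ollama server for a completion.
# Assumes Ollama is serving on its default port and the model below has
# been pulled beforehand (e.g. `ollama pull llama3.2:1b`).
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3.2:1b",  # example small model; use whatever you pulled
        "prompt": "Explain the borrow checker in one sentence.",
        "stream": False,         # single JSON object instead of a token stream
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```

Open WebUI talks to this same local API, so the setup is identical on a supported GPU or plain CPU; only the speed changes.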

[–] db0@lemmy.dbzer0.com 10 points 2 months ago

https://aihorde.net. FOSS, free, and crowdsourced. No tricks, ads, or venture capital.

[–] Sparky@lemmy.blahaj.zone 5 points 2 months ago

👏Ollama👏

[–] Retro_unlimited@lemmy.world 3 points 2 months ago (1 children)

I self-host several free AI models; one of them I run using a program called "gpt4all", which lets you run a variety of models locally.
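
For reference, gpt4all also ships Python bindings, so the same setup works from a script. A minimal sketch, assuming the gpt4all package is installed (the model file name is illustrative and is downloaded on first use):

```python
# Minimal sketch: run a small local model through the gpt4all Python bindings.
# The model file name is illustrative; it is downloaded on first use.
from gpt4all import GPT4All

model = GPT4All("orca-mini-3b-gguf2-q4_0.gguf")  # small, CPU-friendly model
with model.chat_session():
    print(model.generate("Write a haiku about running models locally.", max_tokens=64))
```

gpt4all is CPU-first, so no GPU is required for small models like this.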

[–] Scoopta@programming.dev 3 points 2 months ago* (last edited 2 months ago) (1 children)

Ollama is also a cool way of running multiple models locally
