this post was submitted on 11 May 2025
435 points (97.6% liked)

you are viewing a single comment's thread
[–] rickyrigatoni@lemm.ee 18 points 2 days ago (4 children)

Is it even actually possible for an LLM to execute commands?

[–] Evotech@lemmy.world 15 points 2 days ago

Sure, if you give it that permission

An online LLM chatbot? No.

[–] dr_robotBones@reddthat.com 12 points 2 days ago (1 children)

I don't think so lol. My local LLM can't, at least.

[–] rickyrigatoni@lemm.ee 10 points 2 days ago

I looked on the cyberweb and it looks like they can, but you need to install things specifically for that purpose. So hopefully nobody at the AI companies did that to the public interface. Or hopefully they did, depending on your purposes.
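
For anyone curious, the "things you install specifically for that purpose" are basically a wrapper around the chat loop: the model itself only produces text, and a separate host program decides whether any of that text gets run as a command. Rough sketch below; the `TOOL:` convention and the `run_shell` name are made up for illustration, not any real product's setup.

```python
import json
import subprocess

# Hypothetical wrapper: the host program, not the LLM, decides whether to
# run anything. The model just emits text like:
#   TOOL: run_shell {"cmd": "ls -l"}
# and this code chooses to execute it (or not).
def handle_reply(reply: str) -> str:
    if reply.startswith("TOOL: run_shell"):
        args = json.loads(reply.split(" ", 2)[2])
        # The dangerous part: actually running what the model asked for.
        result = subprocess.run(
            args["cmd"], shell=True, capture_output=True, text=True, timeout=10
        )
        return result.stdout + result.stderr
    return reply  # plain text answer, nothing executed

print(handle_reply('TOOL: run_shell {"cmd": "echo hello"}'))
```

Without a wrapper like that wired in, the model's "commands" are just words on a screen.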

[–] markovs_gun@lemmy.world 8 points 2 days ago

Not terminal commands like this, but some have the ability to write and execute Python code to solve math problems more reliably than the LLM could on its own. I'm sure that could be abused, but not like this.
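
For what it's worth, that "write and execute Python" feature is essentially a code-interpreter tool: the model emits a snippet, the host runs it and feeds the printed output back into the chat. Here's a toy sketch of the execution side; the function name and the cut-down namespace are my own illustration, not how any vendor actually sandboxes it.

```python
import io
import contextlib

# Hypothetical toy code-interpreter: run a model-written snippet in a
# restricted namespace and capture whatever it prints.
def run_generated_code(snippet: str) -> str:
    buf = io.StringIO()
    # A real sandbox would be far stricter (separate process, no filesystem,
    # time and memory limits); this only narrows the available builtins.
    namespace = {"__builtins__": {"print": print, "range": range, "sum": sum}}
    with contextlib.redirect_stdout(buf):
        exec(snippet, namespace)
    return buf.getvalue()

# e.g. the model "solves" a math question by computing instead of guessing:
print(run_generated_code("print(sum(i * i for i in range(1, 101)))"))
```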

[–] synae@lemmy.sdf.org 3 points 2 days ago

It doesn't understand what that means, so I'm gonna say no.