bnaur

joined 1 year ago
[–] bnaur@lemmy.world 2 points 8 months ago

It's a russian Margolin, or some variant. So yes, a .22LR.

[–] bnaur@lemmy.world 2 points 9 months ago* (last edited 9 months ago)

Yep, once anyone can download an app on their phone and do something like this without any effort in realtime it's going to lose its (shock) value fast. It would be like sketching a crude boobs and vagina on someones photo with MS Paint and trying to use that for blackmail or shaming. It would just seem sad and childish.

[–] bnaur@lemmy.world 1 points 9 months ago* (last edited 9 months ago)

Practical, but not really equivalent though because of nil punning.

[–] bnaur@lemmy.world 1 points 9 months ago* (last edited 9 months ago)

The locking down started with the original MacIntosh (or actually with the Lisa I guess). ISTR they had at least one bit more open period after that, but those have always been the exception.

[–] bnaur@lemmy.world 2 points 9 months ago* (last edited 9 months ago) (1 children)

Wouldn't it be more correct to say that most Americans also use a messaging app (iMessage). The rest are just stuck with SMS to have compatibility with the iPhone users.

As the iPhone was (is?) not as popular in the Europe as it was (is) in the States that might also be one of the reasons why people here ditched SMS so fast once smartphones got popular.

[–] bnaur@lemmy.world 5 points 1 year ago

Join and recommend smaller general instances like lemm.ee, vlemmy.net, and lemmy.one at random instead. Smaller servers have been upgraded for the surge of users too you know

That was basically my logic when I joined lemmy.world a few weeks ago. Oh well...

[–] bnaur@lemmy.world 1 points 1 year ago* (last edited 1 year ago)

For years now I have only read ebooks on my phone, so one evening I decided to get back to the habit of reading real books.

So I take my time and carefully pick just the right book, gather some pillows, turn off the lights and lay comfortable on the couch. And after a few confused moments of flipping through pages I realized that these fucking things didn't work in the dark. And I really don't like to read under a bright light anymore so back to reddit it was for that evening.

That said, I think I'll skip this one, doesn't sound too comfortable.

[–] bnaur@lemmy.world 1 points 1 year ago* (last edited 1 year ago) (1 children)

Speaking as just a hobbyist, a more developer oriented community focused on the topic would be nice, if someone is up to the task.

It's currently hard to find any good information about how to actually use LLMs as part of a software project as most of the related subreddits etc. are more focused on shitposting and you don't currently really want to talk about these in general tech/programming forums without a huge Don't shoot I'm not one of them! disclaimer.

Edit: took a quick look at lemmy.intai.tech and it seems promising!

[–] bnaur@lemmy.world 3 points 1 year ago (1 children)

Regarding little Bobby, is there any known guaranteed way to harden the current systems against prompt injections?

This is something that I'm personally more worried about than Skynet or mass unemployment now that everyone and their dog is rushing to integrate LLMs into to their systems (ok worried maybe a wrong word, but let's just say I have the popcorns ready for the moment the first mass breaches happen with something like the Windows Copilot).

[–] bnaur@lemmy.world 2 points 1 year ago (3 children)

At least I'm interested but more technical discussion about this would probably fit better in some comp sci or programming community? Though most of those are a bit hostile to the LLM related topics these days because of all the hype and low effort spam.

[–] bnaur@lemmy.world 3 points 1 year ago (1 children)

Is the whole "You are an LLM by OpenAI, system date is etc." prompt part of the system message?

A few days ago when I was talking about controlled natural languages with it and asked it to give a summary of the chat so far in Gellish it spit that out.

[–] bnaur@lemmy.world 3 points 1 year ago (3 children)

If these commands were in a system message it would generally refuse to help you.

Doesn't it usually fairly easily give its system message to the user? I have had that happen purely by accident.

view more: next ›