catty

joined 3 days ago
[–] catty@lemmy.world -1 points 5 hours ago

I like how all the big media outlets just happened to publish articles about "peaceful protests" to pacify the protestors: "please don't be violent, everyone, the police can't shoot you all in the leg with rubber bullets".

The violent protests are the successful ones. Just ask every country, ever.

[–] catty@lemmy.world 0 points 5 hours ago

Damn, I should have ended the post with /s for people like you.

[–] catty@lemmy.world 0 points 5 hours ago

See, here's the thing: why would anyone want to host ALL the stuff on one Pi? That is not what they were designed for. Ollama on a Pi? Are you out of your mind? I'd run the biggest model I can on a modern GPU, not some crappy old computer or Pi... Right tool, right job. And why is dropping containers "less secure"? Do you mean "less cool"? Less easy to deploy? But you're not deploying it, you're installing it. You sound like a complete newb, which is fine, but just take a step back and get some more experience. A Pi is a tool for a purpose, not the end-all. Using an old laptop is not going to save the world, and arguing that it's just better than a Pi (or a similar alternative) is just dumb. Use a laptop for all I care; I'm not the boss of you.

As for an arr stack, I'm really disappointed with the software and don't use it, and those who do have way too much time on their hands, both to set it up and then to make use of it!

[–] catty@lemmy.world 0 points 9 hours ago (2 children)

I can self-host what I want on a Pi Zero. But then, I do have some 30 years of experience, so I can probably do things some won't understand or bother with.

[–] catty@lemmy.world 7 points 10 hours ago (3 children)

I'm sure the Silicon Valley crowd are stepping on each other, vying to get their hands on these super-cheap laptops for their 24/7 AI training.

[–] catty@lemmy.world 1 points 10 hours ago* (last edited 10 hours ago)

It's even worth pointing out that you can disable various parts of the Pi so it needs even less juice.
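For instance (a hedged sketch for Raspberry Pi OS — overlay support varies by board, so check your model's documentation), a few lines in /boot/config.txt will power down the radios and the activity LED:

```
# /boot/config.txt — assumes Raspberry Pi OS; verify these overlays exist for your board
dtoverlay=disable-wifi        # power down the Wi-Fi radio
dtoverlay=disable-bt          # power down Bluetooth
dtparam=act_led_trigger=none  # stop the activity LED from blinking
```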

[–] catty@lemmy.world 1 points 10 hours ago (1 children)

> Pi’s are ARM-based, which still to this day limits the scope of their applicability.

Untrue.

> Also, you should absolutely inspect a laptop before buying. Many, if not most, old laptops will run just fine for the next few years.

Until the battery needs replacing (which costs more than a Pi), a key on the keyboard dies, etc.

[–] catty@lemmy.world 0 points 10 hours ago (4 children)

Please be specific rather than lumping all 'Raspberry Pis' together. Different models have very different characteristics.

[–] catty@lemmy.world 2 points 10 hours ago (1 children)

This is generally not true. A small server running on an old Pi will have hardly any draw when idling. It will cost next to nothing to run for the whole year.
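Rough arithmetic backs this up. A quick sketch (the wattages and tariff are assumptions, not measurements — roughly 0.5 W idle for a Pi Zero, 2.5 W for an older full-size Pi, $0.15 per kWh; adjust for your own hardware and rate):

```python
# Back-of-envelope annual electricity cost for an always-on, mostly idle Pi.
# The idle wattages and the $/kWh tariff below are assumed figures.
def annual_cost_usd(idle_watts, usd_per_kwh=0.15):
    kwh_per_year = idle_watts * 24 * 365 / 1000  # watts -> kWh over a year
    return kwh_per_year * usd_per_kwh

print(f"Pi Zero (~0.5 W):      ${annual_cost_usd(0.5):.2f}/year")
print(f"Full-size Pi (~2.5 W): ${annual_cost_usd(2.5):.2f}/year")
```

At those assumed numbers, a Pi Zero lands well under a dollar a year, and even an older full-size Pi only costs a few dollars.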

[–] catty@lemmy.world 1 points 11 hours ago (1 children)

But... that's so uncool...

 

I was looking back at some old Lemmy posts and came across GPT4All. Didn't get much sleep last night because it's awesome, even on my old (10-year-old) laptop with a Compute Capability 5.0 NVIDIA card.

Still, I'm after more. I'd like to be able to:

- generate images and view them in the conversation
- run any Python code it produces (I'm using Debian, and have a default Python env set up)
- analyse local files
- run with CUDA Compute Capability 5.0 / Vulkan compatibility, with the option to use some of the smaller models (1-3B, for example)
- hit a local API from my own Python experiments

Is there anything that can tick those boxes, even if I have to scoot across models for some of the features? I'd prefer a desktop client application to a Docker container running in the background.
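On the local-API point, a sketch of what I'm after, assuming a desktop app exposes an OpenAI-compatible endpoint on localhost (GPT4All's optional local server defaults to port 4891, though the port and the model name below are assumptions to check against the app's settings):

```python
import json
import urllib.request

# Hypothetical endpoint — adjust to whatever your local server reports.
BASE = "http://localhost:4891/v1"

def build_request(prompt, model="local-small-model"):
    # "local-small-model" is a placeholder; use whichever 1-3B model is loaded.
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        BASE + "/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )

req = build_request("Say hi in five words.")
# urllib.request.urlopen(req) would send it once the local server is running.
print(req.full_url)
```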

 

I'm watching some retro television and this show is wild! Beauty contests with 16-year-old girls (though at the time it was legal for 16-year-olds to pose topless for newspapers), old racist comedians from working men's clubs doing their routines, Boney M., English singers of the era, and happy dance routines!

vid
