this post was submitted on 28 Feb 2024

Stable Diffusion


Discuss matters related to our favourite AI Art generation technology

[–] RootBeerGuy@discuss.tchncs.de 5 points 9 months ago (3 children)

Is that feasible on a Raspberry Pi?

[–] Scew@lemmy.world 3 points 9 months ago* (last edited 9 months ago) (2 children)

No, lol. Well, I'm not 100% familiar with the Pi's newest offerings, and I don't know about their PCIe capabilities. Direct quote:

The tool can run on low-cost graphics processing units (GPUs) and needs roughly 8GB of RAM to process requests — versus larger models, which need high-end industrial GPUs.

It makes your question seem silly to imagine hooking up my GPU, which is probably bigger than a Pi, to a Pi.

I've been running all the image-generation models on a 2060 Super (8GB VRAM) up to this point, including SD-XL, the model they "distilled" theirs from... Reading the article, I'm not really sure what exactly they think differentiates them.

[–] Even_Adder@lemmy.dbzer0.com 3 points 9 months ago

There are three models and the smallest one is 700M parameters.
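A back-of-envelope check (assuming fp16 weights at 2 bytes per parameter, which the article doesn't specify) shows why a 700M-parameter model fits comfortably in 8GB:

```python
# Rough memory footprint of the model weights alone.
# Activations, the text encoder/VAE, and runtime overhead are not
# included, so real usage is higher -- this is just the weights.
def weight_memory_gb(num_params: int, bytes_per_param: int = 2) -> float:
    """Weight storage in GiB, assuming fp16 (2 bytes/param) by default."""
    return num_params * bytes_per_param / 1024**3

print(f"{weight_memory_gb(700_000_000):.2f} GB")  # ~1.30 GB for the 700M model
```

So the weights of the smallest model are well under the quoted 8GB; the remaining headroom goes to activations and the rest of the pipeline.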

[–] grue@lemmy.world 3 points 9 months ago (1 children)

It makes your question seem silly to imagine hooking up my GPU, which is probably bigger than a Pi, to a Pi.

Jeff Geerling has entered the chat


[–] Even_Adder@lemmy.dbzer0.com 3 points 9 months ago

Probably. FastSD CPU already runs on a Raspberry Pi 4.

[–] Wooki@lemmy.world 0 points 8 months ago* (last edited 8 months ago)

Lol, read the article: it cites "8GB VRAM", and if I had to guess it will only support Nvidia out of the gate.