AMD has ROCm, which is available on AMD Radeon Instinct GPUs (server GPUs) and some consumer GPUs. You'd need to double-check whether your GPU supports ROCm.
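A quick way to check from Python is with a ROCm build of PyTorch (a minimal sketch, assuming you've installed the ROCm torch wheel from the pytorch.org install selector):

```python
# Minimal check with a ROCm build of PyTorch.
# ROCm builds reuse PyTorch's "cuda" device API, so torch.cuda.* works for AMD GPUs too.
import torch

print("GPU available:", torch.cuda.is_available())
print("HIP/ROCm version:", torch.version.hip)        # None on CUDA-only builds
if torch.cuda.is_available():
    print("Device:", torch.cuda.get_device_name(0))  # e.g. an Instinct or supported Radeon card
```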
It seems there is some discussion on using ROCm with Whisper here: https://github.com/openai/whisper/discussions/105 and here, which suggests it might be possible: https://github.com/openai/whisper/discussions/55
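From those threads, the gist seems to be that a ROCm build of PyTorch exposes the AMD GPU through the usual "cuda" device name, so Whisper can be pointed at it without code changes. A rough sketch, assuming openai-whisper plus a ROCm torch build are installed and `audio.mp3` is a placeholder for your file:

```python
# Sketch: run Whisper on an AMD GPU via a ROCm build of PyTorch.
# With ROCm torch, device="cuda" targets the AMD GPU through HIP.
import whisper

model = whisper.load_model("base", device="cuda")  # pick whichever model size fits your VRAM
result = model.transcribe("audio.mp3")             # placeholder audio path
print(result["text"])
```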
I also found this, which could be of interest:
MLC-LLM, whose goal is to "Enable everyone to develop, optimize and deploy AI models natively on everyone's devices."
Here it is used to deploy Llama-2-13B on an RX 7900 XTX:
https://blog.mlc.ai/2023/08/09/Making-AMD-GPUs-competitive-for-LLM-inference?ref=upstract.com
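The Python side of that flow looks roughly like the following. This is a sketch under assumptions: that the `mlc_chat` package and a prebuilt ROCm model library plus quantized weights are installed as in the MLC-LLM docs, and that the model name string is illustrative rather than exact.

```python
# Sketch of the MLC-LLM chat API on the ROCm backend (e.g. an RX 7900 XTX).
# Assumes mlc_chat is installed and the quantized Llama-2 weights + compiled model lib
# were fetched per the MLC-LLM docs; the model name below is illustrative.
from mlc_chat import ChatModule

cm = ChatModule(model="Llama-2-13b-chat-hf-q4f16_1")  # q4f16_1 = 4-bit weight quantization
output = cm.generate(prompt="What is ROCm?")
print(output)
```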
Thanks for that. I've been able to get Stable Diffusion running locally with ROCm, so it looks like it should be possible then.