this post was submitted on 27 Jun 2024
LocalLLaMA
Community to discuss about LLaMA, the large language model created by Meta AI.
This is intended to be a replacement for r/LocalLLaMA on Reddit.
Agree with the others, this guide is a bit more work than you probably need. I don't really run Windows much anymore, but I did have an easier time with WSL, like the other poster mentioned.
And just to check: are you planning on fine-tuning a model? If so, then the whole Anaconda/Miniconda, PyTorch, etc. path makes sense.
But if you're not fine-tuning and you just want to run a model locally, I'd suggest ollama. If you want a UI on top of it, open-webui is great.
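If you ever do want to script against it later, ollama also exposes a local HTTP API once the server is running. Here's a rough sketch in plain Python, assuming the default localhost:11434 endpoint and a model you've already pulled (I'm using "llama3" as an example name):

```python
# Minimal sketch: query a locally running ollama server via its HTTP API.
# Assumes ollama is serving on the default port 11434 and that a model
# (here "llama3", just an example) has already been pulled.
import json
import urllib.request

payload = json.dumps({
    "model": "llama3",   # swap in whatever model you actually pulled
    "prompt": "Why is the sky blue?",
    "stream": False,     # ask for one complete response instead of a stream
}).encode("utf-8")

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=payload,
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    result = json.loads(resp.read().decode("utf-8"))

print(result["response"])
```

But honestly, for just chatting with a model, `ollama run` plus open-webui covers most of it without writing any code.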
Nah, I'm just wanting to run one for now, maybe more if I get more interested down the line, but I will check those out.