this post was submitted on 09 Jul 2023
19 points (100.0% liked)

Self Hosted - Self-hosting your services.

11399 readers

A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.

Rules

Important

Beginning January 1st, 2024, this rule WILL be enforced. Posts that are not tagged will receive a warning and, if not fixed within 24 hours, will be removed!

Cross-posting

If you see a rule-breaker please DM the mods!

founded 3 years ago

What the title says. Are there any good ChatGPT alts that can be self hosted?

top 9 comments
[–] zephyrvs@lemmy.ml 6 points 1 year ago

Llama.cpp + Wizard Vicuna (Uncensored, if you want to get the real thing) + one of the compatible web interfaces; they should be listed in the llama.cpp readme.

Or try gpt4all which is much easier to use and even offers a selection of downloadable models.

7B/13B/30B+ depends on your hardware, especially GPU.
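To make "depends on your hardware" concrete, here is a rough back-of-the-envelope sketch. The bytes-per-parameter and overhead figures are illustrative assumptions, not exact numbers for any specific quantization format:

```python
# Rough memory estimate for running a LLaMA-family model locally.
# Assumptions (illustrative, not exact): ~2 bytes per parameter at
# fp16, ~0.56 bytes at 4-bit quantization (llama.cpp-style), plus
# ~20% overhead for the KV cache and activations.

def estimated_memory_gb(params_billion, bytes_per_param, overhead=1.2):
    """Return an approximate memory footprint in GiB."""
    return params_billion * 1e9 * bytes_per_param * overhead / 2**30

for size in (7, 13, 30):
    fp16 = estimated_memory_gb(size, 2.0)
    q4 = estimated_memory_gb(size, 0.56)
    print(f"{size}B: ~{fp16:.0f} GiB fp16, ~{q4:.1f} GiB 4-bit")
```

By this estimate a quantized 7B model fits in roughly 4-5 GiB, which is why 7B runs on modest hardware while 30B+ wants a high-VRAM GPU or lots of system RAM.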

[–] ptz@dubvee.org 5 points 1 year ago* (last edited 1 year ago) (1 children)

SearxNG is my favorite.

It's a meta search engine that makes it easy to find what you're looking for without ads, tracking, or SEO crap.

From there, you can train your own, built-in neural net to learn the knowledge for yourself :)

[–] frap129@lemmy.maples.dev 10 points 1 year ago

SearxNG is a meta search engine; how is that relevant to a large language model?

[–] frap129@lemmy.maples.dev 5 points 1 year ago

I use koboldcpp with the Vicuna model. Reasonably fast generation (<1 minute) on a 4th-gen i7; it would probably be on par with ChatGPT in terms of speed if you used a GPU.
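To put "<1 minute on CPU" in perspective, a quick worked estimate. The tokens-per-second figures and reply length below are assumptions for illustration, not benchmarks of koboldcpp:

```python
# How long a chat reply takes at a given generation speed.
# Throughput numbers are illustrative assumptions: CPU-only
# inference of a 13B-class model is often single-digit tokens/sec,
# while a mid-range GPU can reach tens of tokens/sec.

def response_time_s(reply_tokens, tokens_per_second):
    """Seconds to generate a reply of the given length."""
    return reply_tokens / tokens_per_second

reply = 200  # a typical chat-sized answer, in tokens (assumption)
for label, tps in [("CPU (~4 tok/s)", 4), ("GPU (~30 tok/s)", 30)]:
    print(f"{label}: {response_time_s(reply, tps):.0f} s")
```

At ~4 tokens/sec a 200-token reply takes about 50 seconds, which lines up with the "<1 minute" figure above; a GPU in the tens of tokens/sec brings that down to single-digit seconds.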

[–] Weirdbeardgame@lemmy.ml 5 points 1 year ago

Serge. I've heard good things about this one as well.

[–] mrmojo@beehaw.org 4 points 1 year ago* (last edited 1 year ago)

Hi, I think LocalAI is a good place to start.

[–] CanOpener@sh.itjust.works 3 points 1 year ago

I've tried https://github.com/oobabooga/text-generation-webui with LLaMA, but I didn't have enough VRAM to run it.

[–] HumanPerson@sh.itjust.works 1 point 1 year ago

I believe gpt4all has a self-hostable web interface, but I could be wrong. Still, it can run on relatively low-end hardware (relatively, because it still needs a decent amount), and you could just use it on your local computer.

[–] ddtfrog@lemm.ee 1 point 1 year ago

Codestar for dev
