GPT-4's details are leaked. (threadreaderapp.com)

cross-posted from: https://lemmy.intai.tech/post/72919

Parameters count:

GPT-4 is more than 10x the size of GPT-3. We believe it has a total of ~1.8 trillion parameters across 120 layers. Mixture Of Experts - Confirmed.

OpenAI was able to keep costs reasonable by utilizing a mixture-of-experts (MoE) model. They use 16 experts within their model, each with ~111B parameters for the MLP. Two of these experts are routed to per forward pass.
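To make the routing idea concrete, here is a minimal numpy sketch of top-2 mixture-of-experts routing: a gate scores every expert, only the two highest-scoring experts actually run, and their outputs are combined by softmax weight. All names, sizes, and the ReLU-MLP expert shape are illustrative assumptions, not details from the leak.

```python
import numpy as np

def moe_forward(x, experts, gate_w, top_k=2):
    """Hypothetical top-k MoE layer: run only top_k of the experts on x.

    x:       (d,) input vector
    experts: list of (w_in, w_out) weight pairs, one simple ReLU MLP per expert
    gate_w:  (num_experts, d) gating weights
    """
    logits = gate_w @ x                        # one gate score per expert
    top = np.argsort(logits)[-top_k:]          # indices of the top_k experts
    weights = np.exp(logits[top] - logits[top].max())
    weights /= weights.sum()                   # softmax over the selected experts only
    out = np.zeros(experts[0][1].shape[0])
    for w, i in zip(weights, top):
        w_in, w_out = experts[i]
        h = np.maximum(w_in @ x, 0)            # expert MLP: ReLU(w_in @ x)
        out += w * (w_out @ h)                 # weighted sum of the two expert outputs
    return out

# toy demo: 16 tiny experts, only 2 execute per forward pass
rng = np.random.default_rng(0)
d, hidden, n_exp = 8, 32, 16
experts = [(rng.standard_normal((hidden, d)), rng.standard_normal((d, hidden)))
           for _ in range(n_exp)]
gate_w = rng.standard_normal((n_exp, d))
y = moe_forward(rng.standard_normal(d), experts, gate_w)
print(y.shape)
```

The cost saving is visible in the loop: with 16 experts but `top_k=2`, only 2/16 of the expert parameters are touched per token, which is how total parameter count can be ~1.8T while per-token compute stays far smaller.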

Related Article: https://lemmy.intai.tech/post/72922


Is it just me, or are those links going to the wrong places?

[-] manitcor@lemmy.intai.tech 1 points 1 year ago

They are the right ones. Should be a tweet archive and a blog post

Well that's weird because the first takes me to a shitpost with a picture of cake, and the second a shitpost about sucking your dentist's fingers...

[-] manitcor@lemmy.intai.tech 3 points 1 year ago

ewwww lol

are you using an app or the web? the links should point to the intai instance which works fine for me but i don't know what various clients will do with those links

I'm using Connect, so that could explain it! Thanks. I'll see if I can figure it out because this is really interesting to me, but the dentist post is not! Haha!

this post was submitted on 11 Jul 2023
149 points (100.0% liked)

ChatGPT


Unofficial ChatGPT community to discuss anything ChatGPT
