cross-posted from: https://lemmy.intai.tech/post/72919

Parameter count:

GPT-4 is more than 10x the size of GPT-3. We believe it has a total of ~1.8 trillion parameters across 120 layers. Mixture of Experts: confirmed.

OpenAI was able to keep costs reasonable by using a mixture-of-experts (MoE) model. The model has 16 experts, each with ~111B parameters for the MLP, and 2 of these experts are routed to per forward pass. That lines up with the headline figure: 16 × ~111B ≈ 1.78T parameters in the MLP blocks alone, while only 2 × ~111B ≈ 222B of them are active for any given token.
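For intuition, here is a minimal sketch of what top-2 expert routing looks like in PyTorch. Everything in it is an illustrative assumption: the dimensions are toy values, the gating scheme (softmax over the top-2 router scores) is the common one from the MoE literature, and none of this is GPT-4's actual code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Top2MoE(nn.Module):
    """Minimal mixture-of-experts layer with top-2 routing (illustrative only)."""

    def __init__(self, d_model=512, d_ff=2048, n_experts=16, top_k=2):
        super().__init__()
        self.top_k = top_k
        # A gating (router) network scores each token against every expert.
        self.router = nn.Linear(d_model, n_experts)
        # Each expert is an independent MLP -- the ~111B-parameter blocks the post describes.
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        ])

    def forward(self, x):  # x: (batch, seq, d_model)
        scores = self.router(x)                            # (batch, seq, n_experts)
        weights, indices = scores.topk(self.top_k, dim=-1) # keep the 2 best experts per token
        weights = F.softmax(weights, dim=-1)               # normalize over the chosen 2
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = indices[..., k] == e                # tokens routed to expert e in slot k
                if mask.any():
                    out[mask] += weights[..., k][mask].unsqueeze(-1) * expert(x[mask])
        return out
```

The loop makes the cost argument concrete: all 16 experts' parameters sit in memory, but only the 2 selected expert MLPs run per token, which is how total parameter count and per-token compute decouple.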

Related Article: https://lemmy.intai.tech/post/72922

  • unstable_confusion@lemmy.fmhy.ml · 1 year ago

    I’m using Connect, so that could explain it! Thanks. I’ll see if I can figure it out because this is really interesting to me, but the dentist post is not! Haha!