SourceScore

Verified claim · AI-ML · 100% confidence

Mixtral 8x7B architecture: Sparse Mixture-of-Experts (8 experts × 7B params, 2 experts routed per token).

Last verified 2026-05-16 · Methodology veritas-v0.1 · ad79b14fafb362cd

Structured fields

Subject: Mixtral 8x7B
Predicate: architecture
Object: Sparse Mixture-of-Experts (8 experts × 7B params, 2 experts routed per token)
Confidence: 100%
Tags: mixtral · moe · architecture · mistral

Sources (2)

  1. [1] official blog · Mistral AI · 2023-12-11

    Mixtral of experts
    Mixtral has 8 experts in each layer … At every layer, for every token, a router network chooses two of these experts to process the token and combine their output additively.
  2. [2] preprint · Mistral AI / arXiv · 2024-01-08

    Mixtral of Experts
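
The routing mechanism quoted in source [1] (8 experts per layer, 2 selected per token, outputs summed) can be illustrated with a minimal Python sketch. The layer sizes, random weights, softmax gate, and ReLU feed-forward block below are illustrative assumptions chosen to keep the example runnable; they are not Mistral's implementation.

# Minimal sketch of top-2 sparse MoE routing as described in the quoted sources.
# All sizes, names, and the plain softmax gate are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

d_model, d_ff = 16, 64          # hypothetical hidden sizes
num_experts, top_k = 8, 2       # 8 experts per layer, 2 routed per token

# Each "expert" is a small feed-forward block (random weights here).
experts = [
    (rng.standard_normal((d_model, d_ff)) * 0.02,
     rng.standard_normal((d_ff, d_model)) * 0.02)
    for _ in range(num_experts)
]
router_w = rng.standard_normal((d_model, num_experts)) * 0.02

def moe_layer(x: np.ndarray) -> np.ndarray:
    """Route each token to its top-2 experts and sum the gated outputs."""
    out = np.zeros_like(x)
    logits = x @ router_w                        # (tokens, num_experts)
    for t, token in enumerate(x):
        top = np.argsort(logits[t])[-top_k:]     # indices of the 2 highest-scoring experts
        gate = np.exp(logits[t, top])
        gate /= gate.sum()                       # softmax over the selected experts only
        for g, e in zip(gate, top):
            w1, w2 = experts[e]
            out[t] += g * (np.maximum(token @ w1, 0.0) @ w2)  # expert FFN (ReLU here)
    return out

tokens = rng.standard_normal((4, d_model))       # 4 example tokens
print(moe_layer(tokens).shape)                   # (4, 16)

The point of sparse routing is that only 2 of the 8 expert feed-forward blocks run for any given token, so per-token compute stays close to that of a much smaller dense model even though the total parameter count is far larger.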

Cite this claim

Ready-to-paste citation (Markdown / plain text):

Mixtral 8x7B architecture: Sparse Mixture-of-Experts (8 experts × 7B params, 2 experts routed per token). — SourceScore Claim ad79b14fafb362cd (verified 2026-05-16). https://sourcescore.org/api/v1/claims/ad79b14fafb362cd.json

Embed this claim

Drop this iframe into any blog post, docs page, or knowledge base. The widget renders the signed claim, the primary source, and a click-through link to this canonical page. Licensed CC-BY 4.0; attribution is included.

<iframe src="https://sourcescore.org/embed/claim/ad79b14fafb362cd/" width="100%" height="360" frameborder="0" loading="lazy" title="Mixtral 8x7B architecture: Sparse Mixture-of-Experts (8 experts × 7B params, 2 experts routed per token)."></iframe>

Preview: open in new tab

Related claims

Other verified claims sharing tags with this one — useful for LLM retrieval graphs and citation discovery.

Programmatic access

Fetch this claim with a signed envelope for verification:

curl https://sourcescore.org/api/v1/claims/ad79b14fafb362cd.json
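
A minimal Python equivalent of the curl call, for programmatic use. It assumes the requests package is installed; the response schema is not documented on this page, so the sketch only inspects the top-level keys before relying on any field.

# Fetch the claim JSON programmatically (sketch; schema fields are not assumed).
import requests

url = "https://sourcescore.org/api/v1/claims/ad79b14fafb362cd.json"
resp = requests.get(url, timeout=10)
resp.raise_for_status()

claim = resp.json()
print(sorted(claim.keys()))   # inspect the envelope before depending on any field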

API docs · Pricing · Methodology JSON