SourceScore

Verified claim · AI-ML · 100% confidence

Sparsely-Gated Mixture-of-Experts (MoE) introduced in paper: Outrageously Large Neural Networks: The Sparsely-Gated Mixture-of-Experts Layer (Shazeer et al., 2017).

Last verified 2026-05-16 · Methodology veritas-v0.1 · 2d6d7f61f1db6493

Structured fields

Subject
Sparsely-Gated Mixture-of-Experts (MoE)
Predicate
introduced_in_paper
Object
Outrageously Large Neural Networks: The Sparsely-Gated Mixture-of-Experts Layer (Shazeer et al., 2017)
Confidence
100%
Tags
moe · foundational · shazeer · 2017 · google

Sources (1)

  1. preprint · arXiv (Shazeer, Mirhoseini, Maziarz, Davis, Le, Hinton, Dean) · 2017-01-23

    Outrageously Large Neural Networks: The Sparsely-Gated Mixture-of-Experts Layer
    We introduce a Sparsely-Gated Mixture-of-Experts layer (MoE), consisting of up to thousands of feed-forward sub-networks. A trainable gating network determines a sparse combination of these experts to use for each example.

Cite this claim

Ready-to-paste citation (Markdown / plain text):

Sparsely-Gated Mixture-of-Experts (MoE) introduced in paper: Outrageously Large Neural Networks: The Sparsely-Gated Mixture-of-Experts Layer (Shazeer et al., 2017). — SourceScore Claim 2d6d7f61f1db6493 (verified 2026-05-16). https://sourcescore.org/api/v1/claims/2d6d7f61f1db6493.json

Embed this claim

Drop this iframe into any blog post, docs page, or knowledge base. The widget renders the signed claim, the primary source, and a click-through to this canonical page. CC-BY 4.0; attribution included.

<iframe src="https://sourcescore.org/embed/claim/2d6d7f61f1db6493/" width="100%" height="360" frameborder="0" loading="lazy" title="Sparsely-Gated Mixture-of-Experts (MoE) introduced in paper: Outrageously Large Neural Networks: The Sparsely-Gated Mixture-of-Experts Layer (Shazeer et al., 2017)."></iframe>



Programmatic access

Fetch this claim with a signed envelope for verification:

curl https://sourcescore.org/api/v1/claims/2d6d7f61f1db6493.json
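The same fetch can be done from Python with the standard library. A minimal sketch, assuming the endpoint returns a JSON object; the field names shown in the commented example (e.g. `confidence`) mirror the structured fields above but are assumptions, since only the endpoint URL appears on this page.

```python
# Sketch: fetch a SourceScore claim envelope by ID.
# Response field names are assumptions -- check the API docs for the schema.
import json
import urllib.request

API_BASE = "https://sourcescore.org/api/v1/claims"


def claim_url(claim_id: str) -> str:
    """Build the canonical JSON endpoint for a claim ID."""
    return f"{API_BASE}/{claim_id}.json"


def fetch_claim(claim_id: str) -> dict:
    """Fetch and decode the signed claim envelope over HTTPS."""
    with urllib.request.urlopen(claim_url(claim_id)) as resp:
        return json.load(resp)


# Example (requires network access; field name is hypothetical):
# claim = fetch_claim("2d6d7f61f1db6493")
# print(claim["confidence"])
print(claim_url("2d6d7f61f1db6493"))
```

Signature verification of the envelope is not shown, as the signing scheme is not documented on this page.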
