SourceScore

Verified claim · AI-ML · 100% confidence

Mixture of Experts (MoE) revival popularized in: Shazeer et al. 2017 — outrageously large neural networks via sparse gating.

Last verified 2026-05-16 · Methodology veritas-v0.1 · f068236101568ad7

Structured fields

Subject
Mixture of Experts (MoE) revival
Predicate
popularized_in
Object
Shazeer et al. 2017 — outrageously large neural networks via sparse gating
Confidence
100%
Tags
moe · mixture-of-experts · shazeer · google · foundational · iclr · 2017 · introduced_in

Sources (2)

  1. [1] preprint · arXiv (Shazeer, Mirhoseini, Maziarz, Davis, Le, Hinton, Dean / Google Brain) · 2017-01-23

    Outrageously Large Neural Networks: The Sparsely-Gated Mixture-of-Experts Layer
    The capacity of a neural network to absorb information is limited by its number of parameters. Conditional computation, where parts of the network are active on a per-example basis, has been proposed in theory as a way of dramatically increasing model capacity without a proportional increase in computation.
  2. [2] peer-reviewed · ICLR 2017 · 2017-01-23

    Outrageously Large Neural Networks — ICLR 2017 OpenReview

Cite this claim

Ready-to-paste citation (Markdown / plain text):

Mixture of Experts (MoE) revival popularized in: Shazeer et al. 2017 — outrageously large neural networks via sparse gating. — SourceScore Claim f068236101568ad7 (verified 2026-05-16). https://sourcescore.org/api/v1/claims/f068236101568ad7.json

Embed this claim

Drop this iframe into any blog post, docs page, or knowledge base. The widget renders the signed claim + primary source + click-through to this canonical page. CC-BY 4.0; attribution included.

<iframe src="https://sourcescore.org/embed/claim/f068236101568ad7/" width="100%" height="360" frameborder="0" loading="lazy" title="Mixture of Experts (MoE) revival popularized in: Shazeer et al. 2017 — outrageously large neural networks via sparse gating."></iframe>


Related claims

Other verified claims sharing tags with this one — useful for LLM retrieval graphs and citation discovery.

Use this claim in your code

Fetch this signed envelope from your application. The response includes the verbatim excerpt, primary source URLs, and an HMAC-SHA256 signature you can verify locally for audit trails.

cURL

curl https://sourcescore.org/api/v1/claims/f068236101568ad7.json

JavaScript / TypeScript

const r = await fetch("https://sourcescore.org/api/v1/claims/f068236101568ad7.json");
const envelope = await r.json();
console.log(envelope.claim.statement);
// "Mixture of Experts (MoE) revival popularized in: Shazeer et al. 2017 — outrageously large neural networks via sparse gating."

Python

import httpx

r = httpx.get("https://sourcescore.org/api/v1/claims/f068236101568ad7.json")
envelope = r.json()
print(envelope["claim"]["statement"])
# "Mixture of Experts (MoE) revival popularized in: Shazeer et al. 2017 — outrageously large neural networks via sparse gating."

LangChain (retrieve-then-cite)

from langchain_core.tools import tool
import httpx


@tool
def get_mixture_of_experts_moe_revival_fact() -> dict:
    """Fetch the verified SourceScore claim for Mixture of Experts (MoE) revival."""
    r = httpx.get("https://sourcescore.org/api/v1/claims/f068236101568ad7.json")
    return r.json()
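
Verify the signature locally (Python)

The envelope description above mentions an HMAC-SHA256 signature you can check for audit trails. A minimal sketch of that check is below; it assumes the envelope exposes a hex-encoded signature field computed over the compact, key-sorted JSON of the claim object, and that you hold the verification key out of band. The field names and canonicalization shown here are assumptions, so confirm them against the veritas methodology docs before relying on the result.

import hashlib
import hmac
import json

import httpx

# Assumed: the shared verification key is distributed out of band.
VERIFICATION_KEY = b"your-shared-secret"

r = httpx.get("https://sourcescore.org/api/v1/claims/f068236101568ad7.json")
envelope = r.json()

# Assumed canonicalization: compact JSON of the claim object with sorted keys.
canonical = json.dumps(envelope["claim"], sort_keys=True, separators=(",", ":")).encode()
expected = hmac.new(VERIFICATION_KEY, canonical, hashlib.sha256).hexdigest()

# Assumed field name: envelope["signature"] holds the hex-encoded HMAC digest.
if hmac.compare_digest(expected, envelope["signature"]):
    print("signature OK:", envelope["claim"]["statement"])
else:
    print("signature mismatch; treat the claim as unverified")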