Verified claim · AI-ML · 100% confidence
Mixture of Experts (MoE) revival popularized in: Shazeer et al. 2017 — outrageously large neural networks via sparse gating.
Last verified 2026-05-16 · Methodology veritas-v0.1 · f068236101568ad7
Structured fields
- Subject: Mixture of Experts (MoE) revival
- Predicate: popularized_in
- Object: Shazeer et al. 2017 — outrageously large neural networks via sparse gating
- Confidence: 100%
- Tags: moe · mixture-of-experts · shazeer · google · foundational · iclr · 2017 · introduced_in
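For applications consuming these fields programmatically, a minimal sketch of the subject–predicate–object triple as a Python dict; the key names here are illustrative assumptions, not the envelope's documented schema (see the API examples further down):

# Illustrative only: key names are assumptions, not the envelope's
# documented schema. Values are copied verbatim from the fields above.
claim_fields = {
    "subject": "Mixture of Experts (MoE) revival",
    "predicate": "popularized_in",
    "object": ("Shazeer et al. 2017 — outrageously large neural "
               "networks via sparse gating"),
    "confidence": 1.0,  # the page reports 100%
    "tags": ["moe", "mixture-of-experts", "shazeer", "google",
             "foundational", "iclr", "2017", "introduced_in"],
}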
Sources (2)
[1] preprint · arXiv (Shazeer, Mirhoseini, Maziarz, Davis, Le, Hinton, Dean / Google Brain) · 2017-01-23
Outrageously Large Neural Networks: The Sparsely-Gated Mixture-of-Experts Layer
“The capacity of a neural network to absorb information is limited by its number of parameters. Conditional computation, where parts of the network are active on a per-example basis, has been proposed in theory as a way of dramatically increasing model capacity without a proportional increase in computation.”
[2] peer reviewed · ICLR 2017 · 2017-01-23
Outrageously Large Neural Networks — ICLR 2017 OpenReview
Cite this claim
Ready-to-paste citation (Markdown / plain text):
Mixture of Experts (MoE) revival popularized in: Shazeer et al. 2017 — outrageously large neural networks via sparse gating. — SourceScore Claim f068236101568ad7 (verified 2026-05-16). https://sourcescore.org/api/v1/claims/f068236101568ad7.json

Embed this claim
Drop this iframe into any blog post, docs page, or knowledge base. The widget renders the signed claim + primary source + click-through to this canonical page. CC-BY 4.0; attribution included.
<iframe src="https://sourcescore.org/embed/claim/f068236101568ad7/" width="100%" height="360" frameborder="0" loading="lazy" title="Mixture of Experts (MoE) revival popularized in: Shazeer et al. 2017 — outrageously large neural networks via sparse gating."></iframe>

Preview: open in new tab
Related claims
Other verified claims sharing tags with this one — useful for LLM retrieval graphs and citation discovery.
Sparsely-Gated Mixture-of-Experts (MoE) introduced in paper: Outrageously Large Neural Networks: The Sparsely-Gated Mixture-of-Experts Layer (Shazeer et al., 2017).
2d6d7f61f1db6493 · 100% confidence · shares 5 tags (moe, foundational, shazeer…)
Switch Transformer introduced in paper: Switch Transformers: Scaling to Trillion Parameter Models with Simple and Efficient Sparsity (Fedus et al., 2021).
3d9c14b9379038c9 · 100% confidence · shares 3 tags (moe, foundational, google)
Vision Transformer (ViT) introduced in paper: An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale (Dosovitskiy et al., 2020).
d3681b0981e0b700 · 100% confidence · shares 3 tags (foundational, google, iclr)
Reformer introduced in paper: Reformer: The Efficient Transformer (Kitaev, Kaiser, Levskaya, 2020).
76f7f00e79bc18c8 · 100% confidence · shares 3 tags (foundational, iclr, google)
AdamW optimizer introduced in paper: Decoupled Weight Decay Regularization (Loshchilov & Hutter, 2017).
b6d51eba4fc7f918 · 100% confidence · shares 3 tags (foundational, 2017, iclr)
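Each related claim above can be fetched through the same claims endpoint used in the next section, so a small citation graph is a handful of GETs. A minimal sketch in Python (the endpoint pattern is taken from the API examples below; everything else is plain httpx):

import httpx

# Claim IDs listed under "Related claims" above.
RELATED_IDS = [
    "2d6d7f61f1db6493",  # Sparsely-Gated MoE paper
    "3d9c14b9379038c9",  # Switch Transformer
    "d3681b0981e0b700",  # Vision Transformer (ViT)
    "76f7f00e79bc18c8",  # Reformer
    "b6d51eba4fc7f918",  # AdamW
]

def fetch_related_claims() -> list[dict]:
    """Fetch each related claim envelope via the documented claims endpoint."""
    envelopes = []
    for claim_id in RELATED_IDS:
        r = httpx.get(f"https://sourcescore.org/api/v1/claims/{claim_id}.json")
        r.raise_for_status()
        envelopes.append(r.json())
    return envelopes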
Use this claim in your code
Fetch this signed envelope from your application. The response includes the verbatim excerpt, primary source URLs, and an HMAC-SHA256 signature you can verify locally for audit trails.
cURL
curl https://sourcescore.org/api/v1/claims/f068236101568ad7.json

JavaScript / TypeScript
const r = await fetch("https://sourcescore.org/api/v1/claims/f068236101568ad7.json");
const envelope = await r.json();
console.log(envelope.claim.statement);
// "Mixture of Experts (MoE) revival popularized in: Shazeer et al. 2017 — outrageously large neural networks via sparse gating."Python
import httpx
r = httpx.get("https://sourcescore.org/api/v1/claims/f068236101568ad7.json")
envelope = r.json()
print(envelope["claim"]["statement"])
# "Mixture of Experts (MoE) revival popularized in: Shazeer et al. 2017 — outrageously large neural networks via sparse gating."LangChain (retrieve-then-cite)
from langchain_core.tools import tool
import httpx
@tool
def get_mixture_of_experts_moe_revival_fact() -> dict:
"""Fetch the verified SourceScore claim for Mixture of Experts (MoE) revival."""
r = httpx.get("https://sourcescore.org/api/v1/claims/f068236101568ad7.json")
return r.json()
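To verify the HMAC-SHA256 signature locally, as mentioned above, a minimal sketch using only the Python standard library. The envelope field names ("claim", "signature"), the hex encoding, the canonical-JSON payload, and the provenance of the shared key are all assumptions here; check the actual envelope schema before relying on it:

import hashlib
import hmac
import json

def verify_envelope(envelope: dict, shared_key: bytes) -> bool:
    """Recompute the HMAC-SHA256 over the claim payload and compare it
    against the envelope's published signature.

    Assumption: the signature covers the canonical JSON of the "claim"
    object and is hex-encoded under "signature"; adjust both to the
    actual envelope schema.
    """
    payload = json.dumps(envelope["claim"], sort_keys=True,
                         separators=(",", ":")).encode("utf-8")
    expected = hmac.new(shared_key, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, envelope.get("signature", ""))

hmac.compare_digest is used rather than == so the comparison runs in constant time and does not leak timing information about the digest.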