SourceScore VERITAS · verified claim · 100% confidence

The Sparsely-Gated Mixture-of-Experts (MoE) layer was introduced in the paper "Outrageously Large Neural Networks: The Sparsely-Gated Mixture-of-Experts Layer" (Shazeer et al., 2017).

Subject: Sparsely-Gated Mixture-of-Experts (MoE)
Predicate: introduced_in_paper
Object: Outrageously Large Neural Networks: The Sparsely-Gated Mixture-of-Experts Layer (Shazeer et al., 2017)
Primary source · preprint · 2017-01-23
Outrageously Large Neural Networks: The Sparsely-Gated Mixture-of-Experts Layer · arXiv (Shazeer, Mirhoseini, Maziarz, Davis, Le, Hinton, Dean)
Last verified 2026-05-16 · 1 source · 2d6d7f61f1db6493
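
For context on the technique this claim names: the paper's layer computes y = Σᵢ G(x)ᵢ Eᵢ(x), where the experts Eᵢ are small feed-forward networks and the gate G keeps only the top-k logits (to which the paper also adds tunable Gaussian noise for load balancing) before the softmax, so only k experts run per example. The PyTorch sketch below is a minimal illustration of that top-k gating, not the paper's implementation; the class name SparseMoELayer and all hyperparameters are ours, and the noise and load-balancing terms are omitted.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SparseMoELayer(nn.Module):
    """Illustrative sparsely-gated MoE layer: each input is routed to the
    top-k experts chosen by a learned gating network, and the expert
    outputs are combined with the renormalized gate weights."""

    def __init__(self, dim: int, num_experts: int = 8, k: int = 2):
        super().__init__()
        self.k = k
        # Experts: independent two-layer feed-forward networks.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.ReLU(), nn.Linear(4 * dim, dim))
            for _ in range(num_experts)
        )
        # Gating network: one logit per expert. The paper additionally adds
        # tunable Gaussian noise to these logits; omitted here for brevity.
        self.gate = nn.Linear(dim, num_experts, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (batch, dim)
        logits = self.gate(x)                            # (batch, num_experts)
        top_vals, top_idx = logits.topk(self.k, dim=-1)  # keep top-k logits
        # Softmax over only the surviving logits is equivalent to masking
        # the rest to -inf and then applying softmax (KeepTopK + Softmax).
        weights = F.softmax(top_vals, dim=-1)            # (batch, k)
        out = torch.zeros_like(x)
        for slot in range(self.k):
            for e, expert in enumerate(self.experts):
                mask = top_idx[:, slot] == e             # inputs routed to expert e
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(-1) * expert(x[mask])
        return out


# Usage: 16 inputs of width 64, routed over 8 experts with k = 2.
layer = SparseMoELayer(dim=64)
y = layer(torch.randn(16, 64))
print(y.shape)  # torch.Size([16, 64])
```

Because the softmax is taken after the top-k cut, the non-selected experts receive exactly zero weight and are never evaluated, which is what makes the layer's compute sparse even with a very large expert count.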