Tag
1 verified claim carries this tag. Each claim is backed by 2+ primary sources and carries an HMAC-SHA256 signature.
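As a minimal sketch of how such a signature could be produced and checked: the claim record is serialized to canonical bytes and signed with a keyed HMAC-SHA256. The field names, canonicalization scheme, and key handling below are illustrative assumptions, not this system's actual scheme.

```python
import hmac
import hashlib
import json

def sign_claim(claim: dict, key: bytes) -> str:
    """Return the hex HMAC-SHA256 signature of a canonically serialized claim."""
    # Canonical JSON (sorted keys, no extra whitespace) so the same claim
    # always serializes to the same bytes before signing. This
    # canonicalization choice is an assumption for illustration.
    payload = json.dumps(claim, sort_keys=True, separators=(",", ":")).encode()
    return hmac.new(key, payload, hashlib.sha256).hexdigest()

def verify_claim(claim: dict, key: bytes, signature: str) -> bool:
    """Recompute the signature and compare in constant time."""
    # hmac.compare_digest avoids timing side channels on the comparison.
    return hmac.compare_digest(sign_claim(claim, key), signature)
```

A verifier holding the shared key can then call `verify_claim(record, key, stored_signature)` and reject any record whose bytes have changed since signing.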
Mixture of Experts (MoE) revival popularized in: Shazeer et al., 2017, "Outrageously Large Neural Networks: The Sparsely-Gated Mixture-of-Experts Layer".
f068236101568ad7 · 2 sources · 100% confidence
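Since the claim cites sparse gating, here is a minimal sketch of the top-k routing idea from that paper: a gating network scores all experts, only the k highest-scoring experts run, and their outputs are combined with softmax weights. The shapes, the noise-free gating, and the expert interface are simplifications; the paper additionally uses noisy gating and load-balancing losses.

```python
import numpy as np

def sparse_moe_layer(x, gate_w, experts, k=2):
    """Sparsely-gated MoE forward pass for a single token vector x.

    x:       (d_model,) input vector
    gate_w:  (d_model, n_experts) gating weights
    experts: list of callables, each mapping x -> output vector
    k:       number of experts activated per token (top-k gating)
    """
    logits = x @ gate_w                      # one gating score per expert
    top_k = np.argsort(logits)[-k:]          # indices of the k best experts
    # Softmax over only the selected experts; the rest get exactly zero
    # weight, so their forward passes are never computed.
    weights = np.exp(logits[top_k] - logits[top_k].max())
    weights /= weights.sum()
    return sum(w * experts[i](x) for w, i in zip(weights, top_k))
```

The point of the sparsity is that per-token compute scales with k rather than with the total number of experts, which is what lets the parameter count grow "outrageously" large at roughly constant cost.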