Tag: knowledge-distillation
Two verified claims carry this tag. Each has at least two primary sources and an HMAC-SHA256 signature.
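The exact signing scheme is not spelled out on this page. As a minimal sketch, assuming the signature is an HMAC-SHA256 over a claim's canonical text with a registry-held secret key, and that the short hex identifiers shown below are truncated digests, verification could look like this (the key, the claim encoding, and the function names are illustrative, not the registry's actual API):

```python
import hmac
import hashlib

def sign_claim(claim_text: str, secret_key: bytes) -> str:
    """Hex HMAC-SHA256 tag over the claim's canonical text."""
    return hmac.new(secret_key, claim_text.encode("utf-8"), hashlib.sha256).hexdigest()

def verify_claim(claim_text: str, secret_key: bytes, stored_tag: str) -> bool:
    """Constant-time check; stored_tag may be a truncated prefix of the full digest."""
    full = sign_claim(claim_text, secret_key)
    return hmac.compare_digest(full[: len(stored_tag)], stored_tag)

# Placeholder values for illustration only.
key = b"registry-secret-key"  # hypothetical key, not the registry's
claim = "DistilBERT introduced in: Sanh et al. 2019"
tag = sign_claim(claim, key)
assert verify_claim(claim, key, tag[:16])
```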
DistilBERT introduced in: Sanh et al. 2019, "DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter".
245af747a3d21061 · 2 sources · 100% confidence
Knowledge Distillation popularized in: Hinton, Vinyals, and Dean 2015, "Distilling the Knowledge in a Neural Network".
f14acb906ba6c12f · 2 sources · 100% confidence
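For context on the second claim: the technique trains a small student to match a teacher's temperature-softened output distribution in addition to the hard labels. A minimal NumPy sketch of that loss follows; the temperature, the weighting alpha, and the function names are chosen here for illustration, not taken from either paper's code.

```python
import numpy as np

def softmax(logits: np.ndarray, temperature: float = 1.0) -> np.ndarray:
    """Temperature-scaled softmax over the last axis."""
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, hard_labels,
                      temperature=2.0, alpha=0.5):
    """Soft-target cross-entropy (scaled by T^2, as suggested in Hinton et al. 2015)
    blended with the usual hard-label cross-entropy."""
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    soft_ce = -(p_teacher * np.log(p_student + 1e-12)).sum(axis=-1).mean()

    p_hard = softmax(student_logits)  # T = 1 for the hard-label term
    hard_ce = -np.log(p_hard[np.arange(len(hard_labels)), hard_labels] + 1e-12).mean()

    return alpha * (temperature ** 2) * soft_ce + (1.0 - alpha) * hard_ce

# Toy usage: batch of 4 examples, 3 classes, random logits.
rng = np.random.default_rng(0)
student = rng.normal(size=(4, 3))
teacher = rng.normal(size=(4, 3))
labels = np.array([0, 2, 1, 0])
print(distillation_loss(student, teacher, labels))
```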