Verified claim · AI-ML · 100% confidence
Knowledge Distillation popularized in: Hinton, Vinyals, Dean 2015 — distilling the knowledge in a neural network.
Last verified 2026-05-16 · Methodology veritas-v0.1 · f14acb906ba6c12f
Structured fields
- Subject
- Knowledge Distillation
- Predicate
- popularized_in
- Object
- Hinton, Vinyals, Dean 2015 — distilling the knowledge in a neural network
- Confidence
- 100%
- Tags
- knowledge-distillation · hinton · google · compression · foundational · 2015 · introduced_in
Sources (2)
[1] preprint · arXiv (Hinton, Vinyals, Dean / Google) · 2015-03-09
Distilling the Knowledge in a Neural Network
“A very simple way to improve the performance of almost any machine learning algorithm is to train many different models on the same data and then to average their predictions. Unfortunately, making predictions using a whole ensemble of models is cumbersome and may be too computationally expensive. We show that it is possible to compress the knowledge in an ensemble into a single model which is much easier to deploy.”
[2] preprint · arXiv · 2015-03-09
Knowledge Distillation — full paper PDF
Cite this claim
Ready-to-paste citation (Markdown / plain text):
Knowledge Distillation popularized in: Hinton, Vinyals, Dean 2015 — distilling the knowledge in a neural network. — SourceScore Claim f14acb906ba6c12f (verified 2026-05-16). https://sourcescore.org/api/v1/claims/f14acb906ba6c12f.json
Embed this claim
Drop this iframe into any blog post, docs page, or knowledge base. The widget renders the signed claim + primary source + click-through to this canonical page. CC-BY 4.0; attribution included.
<iframe src="https://sourcescore.org/embed/claim/f14acb906ba6c12f/" width="100%" height="360" frameborder="0" loading="lazy" title="Knowledge Distillation popularized in: Hinton, Vinyals, Dean 2015 — distilling the knowledge in a neural network."></iframe>
Related claims
Other verified claims sharing tags with this one — useful for LLM retrieval graphs and citation discovery.
Batch Normalization introduced in paper: Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift (Ioffe & Szegedy, 2015).
56c451642ab41e68 · 100% confidence · shares 3 tags (foundational, 2015, google)
DistilBERT introduced in: Sanh et al. 2019 — a smaller, faster, cheaper BERT via knowledge distillation.
245af747a3d21061 · 100% confidence · shares 3 tags (knowledge-distillation, foundational, introduced_in)
Backpropagation algorithm popularized in: Rumelhart, Hinton, Williams 1986 — Nature paper.
e5471a750d13a672 · 100% confidence · shares 3 tags (hinton, foundational, introduced_in)
U-Net introduced in: Ronneberger, Fischer, Brox 2015 — biomedical image segmentation.
4f19829aa2036770 · 100% confidence · shares 3 tags (foundational, 2015, introduced_in)
Mixture of Experts (MoE) revival popularized in: Shazeer et al. 2017 — outrageously large neural networks via sparse gating.
f068236101568ad7 · 100% confidence · shares 3 tags (google, foundational, introduced_in)
Use this claim in your code
Fetch this signed envelope from your application. The response includes the verbatim excerpt, primary source URLs, and an HMAC-SHA256 signature you can verify locally for audit trails.
cURL
curl https://sourcescore.org/api/v1/claims/f14acb906ba6c12f.json
JavaScript / TypeScript
const r = await fetch("https://sourcescore.org/api/v1/claims/f14acb906ba6c12f.json");
const envelope = await r.json();
console.log(envelope.claim.statement);
// "Knowledge Distillation popularized in: Hinton, Vinyals, Dean 2015 — distilling the knowledge in a neural network."Python
import httpx
r = httpx.get("https://sourcescore.org/api/v1/claims/f14acb906ba6c12f.json")
envelope = r.json()
print(envelope["claim"]["statement"])
# "Knowledge Distillation popularized in: Hinton, Vinyals, Dean 2015 — distilling the knowledge in a neural network."LangChain (retrieve-then-cite)
from langchain_core.tools import tool
import httpx
@tool
def get_knowledge_distillation_fact() -> dict:
    """Fetch the verified SourceScore claim for Knowledge Distillation."""
    r = httpx.get("https://sourcescore.org/api/v1/claims/f14acb906ba6c12f.json")
    return r.json()
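The intro above mentions that the envelope carries an HMAC-SHA256 signature you can verify locally. A minimal sketch of that check, using only the Python standard library, might look like the following. Note the envelope field names (`claim`, `signature`), the canonicalisation (compact JSON with sorted keys), and the shared-key mechanism are assumptions for illustration; consult the actual envelope schema before relying on this.

```python
import hashlib
import hmac
import json

def verify_envelope(envelope: dict, shared_key: bytes) -> bool:
    """Recompute the HMAC-SHA256 over the claim payload and compare it
    to the signature shipped in the envelope.

    Assumed layout (illustrative, not the documented schema):
      envelope["claim"]     -> dict with the claim fields
      envelope["signature"] -> hex-encoded HMAC-SHA256 digest
    """
    # Canonicalise the claim deterministically before hashing
    # (compact separators, sorted keys) -- an assumption here.
    payload = json.dumps(
        envelope["claim"], sort_keys=True, separators=(",", ":")
    ).encode()
    expected = hmac.new(shared_key, payload, hashlib.sha256).hexdigest()
    # Constant-time comparison to avoid timing side channels
    return hmac.compare_digest(expected, envelope["signature"])
```

In practice you would fetch the envelope as in the examples above, then call `verify_envelope(envelope, key)` before logging the claim to your audit trail.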