Verified claim · AI-ML · 100% confidence
Reformer introduced in paper: Reformer: The Efficient Transformer (Kitaev, Kaiser, Levskaya, 2020).
Last verified 2026-05-16 · Methodology veritas-v0.1 · 76f7f00e79bc18c8
Structured fields
- Subject
- Reformer
- Predicate
- introduced_in_paper
- Object
- Reformer: The Efficient Transformer (Kitaev, Kaiser, Levskaya, 2020)
- Confidence
- 100%
- Tags
- reformer · efficient-transformer · lsh-attention · foundational · 2020 · iclr · google
Sources (2)
[1] preprint · arXiv (Kitaev, Kaiser, Levskaya) · 2020-01-13
Reformer: The Efficient Transformer
“We introduce two techniques to improve the efficiency of Transformers. For one, we replace dot-product attention by one that uses locality-sensitive hashing, changing its complexity from O(L^2) to O(L log L), where L is the length of the sequence.” (The hashing step is sketched in code after this list.)
[2] peer reviewed · OpenReview / ICLR · 2020-04-26
Reformer: The Efficient Transformer (ICLR 2020)
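The excerpt in source [1] describes replacing dot-product attention with one based on locality-sensitive hashing. Below is a minimal NumPy sketch of the angular-LSH bucketing step from the paper (random rotations, then argmax over the projections and their negations); names like lsh_buckets are illustrative rather than the authors' code, and the full method additionally sorts positions by bucket and attends within chunks, which is where the O(L log L) cost comes from.

import numpy as np

def lsh_buckets(vectors: np.ndarray, n_buckets: int, rng: np.random.Generator) -> np.ndarray:
    """Assign each position to a hash bucket via a random rotation.

    vectors: (seq_len, d_model) shared query/key vectors (Reformer ties Q = K).
    Returns one integer bucket id per position; nearby vectors tend to land
    in the same bucket, so attention can be restricted to within buckets.
    """
    d_model = vectors.shape[-1]
    random_rotation = rng.standard_normal((d_model, n_buckets // 2))
    rotated = vectors @ random_rotation                     # (seq_len, n_buckets/2)
    rotated = np.concatenate([rotated, -rotated], axis=-1)  # (seq_len, n_buckets)
    return np.argmax(rotated, axis=-1)

rng = np.random.default_rng(0)
qk = rng.standard_normal((16, 8))             # 16 positions, d_model = 8
print(lsh_buckets(qk, n_buckets=4, rng=rng))  # one bucket id per position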
Cite this claim
Ready-to-paste citation (Markdown / plain text):
Reformer introduced in paper: Reformer: The Efficient Transformer (Kitaev, Kaiser, Levskaya, 2020). — SourceScore Claim 76f7f00e79bc18c8 (verified 2026-05-16). https://sourcescore.org/api/v1/claims/76f7f00e79bc18c8.json
Embed this claim
Drop this iframe into any blog post, docs page, or knowledge base. The widget renders the signed claim + primary source + click-through to this canonical page. CC-BY 4.0; attribution included.
<iframe src="https://sourcescore.org/embed/claim/76f7f00e79bc18c8/" width="100%" height="360" frameborder="0" loading="lazy" title="Reformer introduced in paper: Reformer: The Efficient Transformer (Kitaev, Kaiser, Levskaya, 2020)."></iframe>
Related claims
Other verified claims sharing tags with this one — useful for LLM retrieval graphs and citation discovery; a minimal ranking sketch follows the list.
Vision Transformer (ViT) introduced in paper: An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale (Dosovitskiy et al., 2020).
d3681b0981e0b700 · 100% confidence · shares 4 tags (foundational, 2020, google…)
ELECTRA introduced in paper: ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators (Clark et al., 2020).
2f9c79357e9d4da9 · 100% confidence · shares 3 tags (foundational, 2020, google)
Retrieval-Augmented Generation (RAG) introduced in paper: Retrieval-Augmented Generation for Knowledge-Intensive NLP Tasks (Lewis et al., 2020).
d15057ced937a103 · 100% confidence · shares 2 tags (foundational, 2020)
BERT (Bidirectional Encoder Representations from Transformers) introduced in paper: BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding (Devlin et al., 2018).
4c1ee70007dc89c1 · 100% confidence · shares 2 tags (foundational, google)
T5 (Text-to-Text Transfer Transformer) introduced in paper: Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer (Raffel et al., 2019).
ef28341c3b308737 · 100% confidence · shares 2 tags (foundational, google)
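The "shares N tags" relation above can be reproduced as a simple shared-tag ranking. The sketch below is hypothetical: the ranking rule (shared-tag count, descending) is an assumption, and the tag sets use only the tags visible on this page, so the ViT entry with a truncated tag list is omitted.

claim_tags = {"reformer", "efficient-transformer", "lsh-attention",
              "foundational", "2020", "iclr", "google"}

candidates = {
    "2f9c79357e9d4da9": {"foundational", "2020", "google"},  # ELECTRA
    "d15057ced937a103": {"foundational", "2020"},            # RAG
    "4c1ee70007dc89c1": {"foundational", "google"},          # BERT
}

# Rank related claims by the number of tags shared with this claim.
ranked = sorted(candidates.items(),
                key=lambda item: len(item[1] & claim_tags),
                reverse=True)
for claim_id, tags in ranked:
    print(claim_id, "shares", len(tags & claim_tags), "tags")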
Programmatic access
Fetch this claim with a signed envelope for verification:
curl https://sourcescore.org/api/v1/claims/76f7f00e79bc18c8.json
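The same fetch in Python, using only the standard library; the envelope's field names are not documented on this page, so this sketch just loads the JSON and lists its top-level keys rather than assuming a schema.

import json
import urllib.request

url = "https://sourcescore.org/api/v1/claims/76f7f00e79bc18c8.json"
with urllib.request.urlopen(url) as resp:  # fetch the signed claim envelope
    envelope = json.load(resp)

print(sorted(envelope))                    # inspect whatever keys it carries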