SourceScore VERITAS · verified claim · 100% confidence

BERT (Bidirectional Encoder Representations from Transformers) was introduced in the paper "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding" (Devlin et al., 2018).

Subject
BERT (Bidirectional Encoder Representations from Transformers)
Predicate
introduced_in_paper
Object
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding (Devlin et al., 2018)
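The Subject/Predicate/Object fields above form a standard knowledge-graph triple. A minimal sketch of that structure in Python (the `Claim` dataclass and its field names are illustrative assumptions, not SourceScore's actual schema):

```python
from dataclasses import dataclass

# Hypothetical representation of a verified claim as a
# subject-predicate-object triple, as used in knowledge graphs.
# Class and field names are illustrative, not SourceScore's schema.
@dataclass(frozen=True)
class Claim:
    subject: str
    predicate: str
    obj: str

bert_claim = Claim(
    subject="BERT (Bidirectional Encoder Representations of Transformers)".replace(
        "of Transformers", "from Transformers"),  # keep the full expansion intact
    predicate="introduced_in_paper",
    obj=("BERT: Pre-training of Deep Bidirectional Transformers "
         "for Language Understanding (Devlin et al., 2018)"),
)

print(bert_claim.predicate)
```

Freezing the dataclass keeps each triple immutable, which is convenient when claims are hashed or deduplicated.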
Primary source · preprint · 2018-10-11
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding · arXiv (Devlin, Chang, Lee, Toutanova)
Last verified 2026-05-16 · 2 sources · 4c1ee70007dc89c1