BERT (Bidirectional Encoder Representations from Transformers) was introduced in the paper "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding" (Devlin et al., 2018).
Subject
BERT (Bidirectional Encoder Representations from Transformers)
Predicate
introduced_in_paper
Object
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding (Devlin et al., 2018)
Primary source · preprint · 2018-10-11
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding — arXiv (Devlin, Chang, Lee, Toutanova)