BART introduced in: Lewis et al. 2019 — denoising sequence-to-sequence pre-training.
Primary source · preprint · 2019-10-29
BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension — arXiv:1910.13461 (Lewis, Liu, Goyal, Ghazvininejad, Mohamed, Levy, Stoyanov, Zettlemoyer / Facebook AI)