SourceScore

Verified claim · AI-ML · 100% confidence

BART introduced in: Lewis et al. 2019 — denoising sequence-to-sequence pretraining.

Last verified 2026-05-16 · Methodology veritas-v0.1 · f5b422e3255fd7c0

Structured fields

Subject
BART
Predicate
introduced_in
Object
Lewis et al. 2019 — denoising sequence-to-sequence pretraining
Confidence
100%
Tags
bart · facebook-ai · denoising · seq2seq · encoder-decoder · foundational · 2019 · introduced_in

Sources (2)

  1. [1] preprint · arXiv (Lewis, Liu, Goyal, Ghazvininejad, Mohamed, Levy, Stoyanov, Zettlemoyer / Facebook AI) · 2019-10-29

    BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension
    We present BART, a denoising autoencoder for pretraining sequence-to-sequence models. BART is trained by (1) corrupting text with an arbitrary noising function, and (2) learning a model to reconstruct the original text. It uses a standard Tranformer-based [sic] neural machine translation architecture which, despite its simplicity, can be seen as generalizing BERT (due to the bidirectional encoder), GPT (with the left-to-right decoder), and many other more recent pretraining schemes.
  2. [2] official docs · Hugging Face · 2019-10-29

    BART — Hugging Face Transformers documentation

Cite this claim

Ready-to-paste citation (Markdown / plain text):

BART introduced in: Lewis et al. 2019 — denoising sequence-to-sequence pretraining. — SourceScore Claim f5b422e3255fd7c0 (verified 2026-05-16). https://sourcescore.org/api/v1/claims/f5b422e3255fd7c0.json

Embed this claim

Drop this iframe into any blog post, docs page, or knowledge base. The widget renders the signed claim, its primary source, and a click-through to this canonical page. Licensed CC-BY 4.0; attribution is included.

<iframe src="https://sourcescore.org/embed/claim/f5b422e3255fd7c0/" width="100%" height="360" frameborder="0" loading="lazy" title="BART introduced in: Lewis et al. 2019 — denoising sequence-to-sequence pretraining."></iframe>


Related claims

Other verified claims sharing tags with this one — useful for LLM retrieval graphs and citation discovery.

Use this claim in your code

Fetch this signed envelope from your application. The response includes the verbatim excerpt, primary source URLs, and an HMAC-SHA256 signature you can verify locally for audit trails; a hedged verification sketch follows the Python example below.

cURL

curl https://sourcescore.org/api/v1/claims/f5b422e3255fd7c0.json

JavaScript / TypeScript

const r = await fetch("https://sourcescore.org/api/v1/claims/f5b422e3255fd7c0.json");
const envelope = await r.json();
console.log(envelope.claim.statement);
// "BART introduced in: Lewis et al. 2019 — denoising sequence-to-sequence pretraining."

Python

import httpx

r = httpx.get("https://sourcescore.org/api/v1/claims/f5b422e3255fd7c0.json")
envelope = r.json()
print(envelope["claim"]["statement"])
# "BART introduced in: Lewis et al. 2019 — denoising sequence-to-sequence pretraining."
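
Verify the signature (sketch)

The examples above stop at fetching the envelope. A minimal sketch of the local HMAC-SHA256 check might look like the following; the top-level "signature" field name, the canonical-JSON message format, and the SIGNING_KEY issued out of band are illustrative assumptions, not the documented envelope schema.

import hashlib
import hmac
import json

import httpx

r = httpx.get("https://sourcescore.org/api/v1/claims/f5b422e3255fd7c0.json")
envelope = r.json()

# Assumption: the signature is a hex digest over the canonical JSON of the
# claim object, keyed with a secret issued to you out of band.
SIGNING_KEY = b"replace-with-your-issued-key"  # hypothetical key
message = json.dumps(envelope["claim"], sort_keys=True, separators=(",", ":")).encode()

expected = hmac.new(SIGNING_KEY, message, hashlib.sha256).hexdigest()
if hmac.compare_digest(expected, envelope.get("signature", "")):
    print("signature verified")
else:
    print("signature mismatch: check the envelope schema and key")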

LangChain (retrieve-then-cite)

import httpx
from langchain_core.tools import tool


@tool
def get_bart_fact() -> dict:
    """Fetch the verified SourceScore claim for BART."""
    r = httpx.get("https://sourcescore.org/api/v1/claims/f5b422e3255fd7c0.json")
    return r.json()
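
For a quick smoke test before wiring the tool into an agent, a @tool-decorated function can be invoked directly via LangChain's .invoke() (an empty dict works for a no-argument tool):

envelope = get_bart_fact.invoke({})
print(envelope["claim"]["statement"])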