SourceScore

Verified claim · AI-ML · 100% confidence

SGLang introduced in: Zheng et al. 2024 — efficient LLM serving with structured outputs.

Last verified 2026-05-16 · Methodology veritas-v0.1 · 4244c11611a72550

Structured fields

Subject
SGLang
Predicate
introduced_in
Object
Zheng et al. 2024 — efficient LLM serving with structured outputs
Confidence
100%
Tags
sglang · uc-berkeley · inference · structured-outputs · open-source · 2024 · introduced_in

Sources (2)

  1. [1] preprint · arXiv (Zheng, Yin, Xie, Huang, Yu, Liu, Lin, Cuenca, Zhao, Stoica / UC Berkeley) · 2023-12-12

    SGLang: Efficient Execution of Structured Language Model Programs
    We introduce SGLang, a framework for efficient programming and execution of structured language model programs. SGLang consists of a frontend language and a runtime. The frontend simplifies programming with primitives for generation and parallelism control. The runtime accelerates the execution with optimizations like RadixAttention for KV cache reuse.
  2. [2] github release · SGLang Project / UC Berkeley · 2024-01-01

    SGLang — official GitHub repository

Cite this claim

Ready-to-paste citation (Markdown / plain text):

SGLang introduced in: Zheng et al. 2024 — efficient LLM serving with structured outputs. — SourceScore Claim 4244c11611a72550 (verified 2026-05-16). https://sourcescore.org/api/v1/claims/4244c11611a72550.json

Embed this claim

Drop this iframe into any blog post, docs page, or knowledge base. The widget renders the signed claim + primary source + click-through to this canonical page. CC-BY 4.0; attribution included.

<iframe src="https://sourcescore.org/embed/claim/4244c11611a72550/" width="100%" height="360" frameborder="0" loading="lazy" title="SGLang introduced in: Zheng et al. 2024 — efficient LLM serving with structured outputs."></iframe>


Related claims

Other verified claims sharing tags with this one — useful for LLM retrieval graphs and citation discovery.

Use this claim in your code

Fetch this signed envelope from within your application. The response includes the verbatim excerpt, primary source URLs, and an HMAC-SHA256 signature you can verify locally for audit trails.

cURL

curl https://sourcescore.org/api/v1/claims/4244c11611a72550.json

JavaScript / TypeScript

const r = await fetch("https://sourcescore.org/api/v1/claims/4244c11611a72550.json");
const envelope = await r.json();
console.log(envelope.claim.statement);
// "SGLang introduced in: Zheng et al. 2024 — efficient LLM serving with structured outputs."

Python

import httpx

r = httpx.get("https://sourcescore.org/api/v1/claims/4244c11611a72550.json")
envelope = r.json()
print(envelope["claim"]["statement"])
# "SGLang introduced in: Zheng et al. 2024 — efficient LLM serving with structured outputs."
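The HMAC-SHA256 signature mentioned above can be checked locally once you hold the shared key. The sketch below is a minimal example of that pattern, not the official verification routine: the envelope field names (`claim`, `signature`) and the canonicalization (sorted-key compact JSON of the claim object) are assumptions, and the demo signs a toy envelope locally so it runs without network access.

```python
import hashlib
import hmac
import json

def verify_envelope(envelope: dict, shared_key: bytes) -> bool:
    """Recompute HMAC-SHA256 over the canonical claim JSON and compare it
    to the envelope's signature. Field names are hypothetical."""
    # Canonical form assumed here: sorted keys, no whitespace.
    payload = json.dumps(
        envelope["claim"], sort_keys=True, separators=(",", ":")
    ).encode()
    expected = hmac.new(shared_key, payload, hashlib.sha256).hexdigest()
    # Constant-time comparison avoids timing side channels.
    return hmac.compare_digest(expected, envelope["signature"])

# Demo: sign a toy envelope locally, then verify it.
key = b"demo-shared-key"
claim = {
    "id": "4244c11611a72550",
    "statement": "SGLang introduced in: Zheng et al. 2024",
}
canonical = json.dumps(claim, sort_keys=True, separators=(",", ":")).encode()
envelope = {
    "claim": claim,
    "signature": hmac.new(key, canonical, hashlib.sha256).hexdigest(),
}
print(verify_envelope(envelope, key))  # True
```

If the real envelope canonicalizes differently (e.g. signs the raw response bytes), swap the `payload` line accordingly; the compare step stays the same.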

LangChain (retrieve-then-cite)

from langchain_core.tools import tool
import httpx

@tool
def get_sglang_fact() -> dict:
    """Fetch the verified SourceScore claim for SGLang."""
    r = httpx.get("https://sourcescore.org/api/v1/claims/4244c11611a72550.json")
    return r.json()