SourceScore

Verified claim · AI-ML · 100% confidence

llama.cpp publicly released on: 2023-03-10 by Georgi Gerganov.

Last verified 2026-05-16 · Methodology veritas-v0.1 · 2c6ddc094019890c

Structured fields

Subject
llama.cpp
Predicate
publicly_released_on
Object
2023-03-10 by Georgi Gerganov
Confidence
100%
Tags
llama-cpp · ggerganov · inference · open-source · cpp · released_on · 2023

Sources (2)

  1. [1] github release · Georgi Gerganov / open-source community · 2023-03-10

    llama.cpp — Inference of Meta's LLaMA model in pure C/C++
    Inference of Meta's LLaMA model (and others) in pure C/C++. The main goal of llama.cpp is to enable LLM inference with minimal setup and state-of-the-art performance on a wide range of hardware - locally and in the cloud.
  2. [2] github release · Georgi Gerganov · 2023-03-10

    llama.cpp first tagged release

Cite this claim

Ready-to-paste citation (Markdown / plain text):

llama.cpp publicly released on: 2023-03-10 by Georgi Gerganov. — SourceScore Claim 2c6ddc094019890c (verified 2026-05-16). https://sourcescore.org/api/v1/claims/2c6ddc094019890c.json

Embed this claim

Drop this iframe into any blog post, docs page, or knowledge base. The widget renders the signed claim + primary source + click-through to this canonical page. CC-BY 4.0; attribution included.

<iframe src="https://sourcescore.org/embed/claim/2c6ddc094019890c/" width="100%" height="360" frameborder="0" loading="lazy" title="llama.cpp publicly released on: 2023-03-10 by Georgi Gerganov."></iframe>

Preview: open in new tab

Related claims

Other verified claims sharing tags with this one — useful for LLM retrieval graphs and citation discovery.

Use this claim in your code

Fetch this signed envelope from your application. The response includes the verbatim excerpt, primary source URLs, and an HMAC-SHA256 signature you can verify locally for audit trails.
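A local verification step could look like the following Python sketch. The envelope field layout, the canonicalization of the signed payload, and the `SOURCESCORE_KEY` verification key are assumptions for illustration — consult the API response itself for the real field names and the key-distribution details.

```python
import hashlib
import hmac


def verify_envelope(signed_payload: bytes, signature_hex: str, key: bytes) -> bool:
    """Recompute HMAC-SHA256 over the signed payload and compare it to the
    envelope's signature in constant time (avoids timing side channels)."""
    expected = hmac.new(key, signed_payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature_hex)


# Illustrative only: a made-up key and payload standing in for the real
# envelope contents (SOURCESCORE_KEY is a hypothetical name).
SOURCESCORE_KEY = b"example-verification-key"
payload = b'{"statement": "llama.cpp publicly released on: 2023-03-10"}'
sig = hmac.new(SOURCESCORE_KEY, payload, hashlib.sha256).hexdigest()
print(verify_envelope(payload, sig, SOURCESCORE_KEY))
```

The constant-time comparison via `hmac.compare_digest` is the important design choice here: a plain `==` on signature strings can leak how many leading characters match.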

cURL

curl https://sourcescore.org/api/v1/claims/2c6ddc094019890c.json

JavaScript / TypeScript

const r = await fetch("https://sourcescore.org/api/v1/claims/2c6ddc094019890c.json");
const envelope = await r.json();
console.log(envelope.claim.statement);
// "llama.cpp publicly released on: 2023-03-10 by Georgi Gerganov."

Python

import httpx

r = httpx.get("https://sourcescore.org/api/v1/claims/2c6ddc094019890c.json")
envelope = r.json()
print(envelope["claim"]["statement"])
# "llama.cpp publicly released on: 2023-03-10 by Georgi Gerganov."

LangChain (retrieve-then-cite)

from langchain_core.tools import tool
import httpx


@tool
def get_llama_cpp_fact() -> dict:
    """Fetch the verified SourceScore claim for llama.cpp."""
    r = httpx.get("https://sourcescore.org/api/v1/claims/2c6ddc094019890c.json")
    return r.json()