
Grounding

The difference between a model that guesses and one that cites

What is grounding?

Grounding is the practice of connecting a language model’s responses to specific, verifiable source material. Instead of generating answers from its training data alone, a grounded model references provided documents, database records, or search results — and can cite them.
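
To make this concrete, here is a minimal sketch of how provided documents might be turned into a citable prompt. The source texts, the build_grounded_prompt helper, and the [n] citation format are all hypothetical illustrations, not any specific product's API.

```python
# A hypothetical helper that anchors a model to numbered source excerpts.
# The instruction to cite claims as [n] is what makes the answer traceable.

def build_grounded_prompt(question: str, sources: list[str]) -> str:
    numbered = "\n".join(f"[{i + 1}] {text}" for i, text in enumerate(sources))
    return (
        "Answer using ONLY the sources below, citing each claim as [n]. "
        "If the sources do not contain the answer, say so.\n\n"
        f"Sources:\n{numbered}\n\nQuestion: {question}\nAnswer:"
    )

prompt = build_grounded_prompt(
    "When was the refund policy last updated?",
    ["The refund policy was last updated in March 2024.",
     "Refunds are processed within five business days."],
)
print(prompt)  # Sent to the model in place of the bare question.
```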

RAG (retrieval-augmented generation) is the most common grounding technique. The model receives relevant documents as context and generates responses anchored to their content.
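
The retrieval step can be sketched as below. This toy version ranks an in-memory document list by keyword overlap purely for illustration; real RAG pipelines typically use vector embeddings and a dedicated search index. The document contents and the retrieve function are hypothetical.

```python
# A toy retriever: rank documents by naive keyword overlap with the query
# (punctuation is not stripped) and return the top k as context for a
# prompt builder like the one sketched above. Production systems would use
# embeddings and approximate nearest-neighbor search instead.

DOCUMENTS = [
    "Grounding connects model output to verifiable source material.",
    "RAG retrieves relevant documents and supplies them as context.",
    "Ungrounded models can produce plausible but incorrect text.",
]

def retrieve(query: str, k: int = 2) -> list[str]:
    terms = set(query.lower().split())
    ranked = sorted(
        DOCUMENTS,
        key=lambda doc: len(terms & set(doc.lower().split())),
        reverse=True,
    )
    return ranked[:k]

context = retrieve("How does RAG supply context to the model?")
print(context)  # These excerpts become the numbered sources in the prompt.
```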

Why it matters

Without grounding, a model generates plausible text that may be factually incorrect. With grounding, its output can be traced back to specific sources. That is the distinction between a creative writing tool and a reliable information system. sourc.dev itself is a grounding source: every data point is source-linked and citable.