Motius Glossary

The Motius Glossary provides an alphabetical overview of the terms we use on our website.

H

Hallucination in LLMs

Hallucination occurs when a language model generates text that sounds plausible but is factually incorrect or entirely fabricated. It is a common failure mode of generative AI. Mitigation strategies include grounding model outputs in external data, retrieval-augmented generation (RAG), and fine-tuning with human feedback (RLHF).
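The grounding idea above can be sketched in a few lines: instead of letting a model answer from its parameters alone, retrieve supporting documents and constrain the answer to that context. The corpus, overlap-based retriever, and prompt template below are illustrative assumptions, not a production RAG pipeline.

```python
# Toy sketch of retrieval-based grounding: fetch relevant documents
# and build a prompt that restricts the model to that context,
# reducing the chance of hallucinated facts.

def retrieve(query: str, corpus: list[str], k: int = 1) -> list[str]:
    """Rank documents by naive word overlap with the query (illustrative only)."""
    q_words = set(query.lower().split())
    scored = sorted(
        corpus,
        key=lambda doc: len(q_words & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:k]

def grounded_prompt(query: str, corpus: list[str]) -> str:
    """Build a prompt that tells the model to answer only from retrieved context."""
    context = "\n".join(retrieve(query, corpus))
    return (
        "Answer using ONLY the context below. "
        "If the context is insufficient, say you don't know.\n\n"
        f"Context:\n{context}\n\nQuestion: {query}"
    )

corpus = [
    "Hallucination means a model invents plausible but false statements.",
    "Retrieval-augmented generation grounds answers in external documents.",
]
print(grounded_prompt("What is hallucination?", corpus))
```

In a real system the keyword retriever would be replaced by embedding search, and the prompt would be sent to an actual LLM; the principle of constraining generation to retrieved evidence stays the same.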