Module 3 · Strengths & limits · 6 min read

Hallucinations: when AI sounds confident but is wrong

The most important AI failure mode, explained. Spot one yourself.

A weird-but-true thing

An LLM doesn’t reliably say “I don’t know” the way a person does. It’s a next-word guesser, and it will always produce more words.

So when it doesn’t actually know something, it will sometimes make it up. Smoothly. Confidently. With invented details. This is called a hallucination.

Why does this happen?

Remember: the model learned to predict what sounds right, not to check what is right. So:

  • It might invent a book title that doesn’t exist, because that title sounds plausible.
  • It might quote a fake scientist with a real-sounding name.
  • It might give you confident, totally wrong math.

The model isn’t lying, because it doesn’t know it’s wrong. It’s a sophisticated improviser.

How to spot a hallucination

Three signs to watch for:

  1. Specific facts that are easy to check: names, dates, numbers, quotes. Always verify.
  2. Suspicious confidence on a niche topic. If you ask about your tiny hometown’s mayor, the AI may just make up a name.
  3. It mixes real and fake. Hallucinations are sneakiest when the surrounding facts are correct.

Spot the hallucination

Below, an AI made three claims on a topic. Two are real. One is invented. Which one did it make up?
Click the claim you think the AI made up.

Now try it on real AI

Ask the AI about something very specific that you can verify: your school, a small business in your town, an obscure book. See if it invents details.

(Sometimes it will say “I’m not sure.” That’s good. Sometimes it’ll confidently make it up. That’s the hallucination.)

Golden rule

Trust the AI for shape, not for fact. Use it to brainstorm, summarize, and explore. Double-check anything that matters: dates, names, sources, math.

Quick check

  1. What is an ‘AI hallucination’?
  2. Why does an LLM hallucinate?
  3. What’s the best way to protect yourself from hallucinations?