@thegibson LLMs don't answer questions, they generate text that looks like what an answer to a question that looks like your question looks like. It's got enough text from the internet in it that something that looks like an answer is often the right answer because, well, you're not as creative as you think, and someone's probably had the same conversation you're having before and the LLM eavesdropped on it. Hallucinations are just when text that looks like an answer happens to not be right. It's not really a special case, it's just the other side of the same coin.
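
Rough toy version of what I mean (a made-up bigram babbler, nothing like the real internals, the corpus and names are invented, it's just the shape of the argument):

```python
# Toy sketch: build a dumb next-word model from some "internet" text,
# then sample from it. The corpus here is made up for illustration.
import random
from collections import defaultdict

corpus = (
    "the capital of france is paris . "
    "the capital of france is lyon . "   # the internet contains wrong stuff too
    "the capital of spain is madrid ."
).split()

# count which word tends to follow which
nxt = defaultdict(list)
for a, b in zip(corpus, corpus[1:]):
    nxt[a].append(b)

def babble(prompt, n=6):
    words = prompt.split()
    for _ in range(n):
        w = words[-1]
        if w not in nxt:
            break
        # same move every time: pick a plausible-looking next word
        words.append(random.choice(nxt[w]))
    return " ".join(words)

# Sometimes this says "paris", sometimes "lyon" or even "madrid".
# There's no "hallucinate now" branch in the loop -- same coin both times.
print(babble("the capital of france is"))
```

When it happens to say "paris" we call it an answer, when it says "lyon" we call it a hallucination, but the loop did the exact same thing both times.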