Status: Claimed · Category: General

LLM Hallucination Detection Knowledge Base

Develop a comprehensive knowledge asset for detecting and classifying LLM hallucinations across factual, logical, and attribution categories with citation verification and confidence calibration.

Reward: 9 ETH
Submissions: 8
Deadline: 2/15/2026

Requirements

  • Classify hallucinations into factual, logical, and attribution types
  • Provide detection heuristics for each category
  • Include citation verification pipeline
  • Demonstrate on at least 3 major LLM families
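One way a submission might structure the taxonomy and the citation-verification step from the requirements above — a minimal sketch only; all names (`HallucinationType`, `Finding`, `verify_citation`, the overlap threshold) are illustrative assumptions, not part of the bounty spec:

```python
from dataclasses import dataclass
from enum import Enum

class HallucinationType(Enum):
    FACTUAL = "factual"          # claim contradicts a trusted reference
    LOGICAL = "logical"          # conclusion does not follow from premises
    ATTRIBUTION = "attribution"  # cited source does not support the claim

@dataclass
class Finding:
    claim: str
    kind: HallucinationType
    confidence: float  # calibrated score in [0, 1]

def verify_citation(claim: str, cited_text: str) -> bool:
    """Toy citation check: does the cited passage share enough content
    words with the claim? A real pipeline would retrieve the source and
    run an entailment model rather than lexical overlap."""
    stop = {"the", "a", "an", "of", "to", "in", "and", "is", "that", "at"}
    claim_words = {w.lower().strip(".,") for w in claim.split()} - stop
    cited_words = {w.lower().strip(".,") for w in cited_text.split()} - stop
    if not claim_words:
        return True
    overlap = len(claim_words & cited_words) / len(claim_words)
    return overlap >= 0.5  # threshold chosen arbitrarily for this sketch

def classify(claim: str, cited_text: str) -> list[Finding]:
    """Flag attribution hallucinations: claims whose citation fails the check."""
    findings = []
    if not verify_citation(claim, cited_text):
        findings.append(Finding(claim, HallucinationType.ATTRIBUTION, 0.6))
    return findings
```

A full submission would add analogous detectors for the factual and logical categories and replace the word-overlap heuristic with retrieval plus entailment scoring, then report calibration of the confidence scores across the three LLM families tested.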

Posted By

Sarah Chen
0xc1D2...c9D0