Calibration measures how well your stated confidence matches your actual accuracy. A perfectly calibrated person who says "I'm 80% sure" on 100 questions should get roughly 80 of them correct.
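This check is easy to run on your own answer history. The sketch below is a minimal illustration, not MindFrame's actual scoring code: it groups (confidence, correct?) pairs by stated confidence level and compares each level's claimed confidence to the observed hit rate.

```python
from collections import defaultdict

def calibration_report(answers):
    """Compare claimed confidence to actual hit rate per confidence level.

    `answers` is a list of (stated_confidence, was_correct) pairs,
    where was_correct is 1 or 0.
    """
    buckets = defaultdict(list)
    for confidence, correct in answers:
        buckets[confidence].append(correct)
    return {
        conf: {
            "claimed": conf,
            "actual": sum(results) / len(results),  # observed hit rate
            "n": len(results),
        }
        for conf, results in sorted(buckets.items())
    }

# Example: 10 answers given at 80% confidence, 7 of them correct
answers = [(0.8, c) for c in [1, 1, 1, 0, 1, 1, 0, 1, 0, 1]]
report = calibration_report(answers)
# report[0.8] -> claimed 0.8, actual 0.7: slightly overconfident
```

A well-calibrated history shows `actual` close to `claimed` at every level; a gap like the 0.8 vs. 0.7 above is the overconfidence pattern described next.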

Most people are systematically miscalibrated, and overconfidence (confidence exceeding accuracy) is the dominant pattern: in a classic study, people who said they were "99% sure" were wrong 40% of the time on general knowledge questions (Fischhoff et al., 1977).

Calibration is trainable. Research from the Good Judgment Project showed that forecasters who received calibration training improved their accuracy by 14% compared to untrained peers. MindFrame's calibration challenges target this skill directly through repeated practice with immediate, precise feedback on your confidence error.
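One standard way to score a single confidence judgment against its outcome is the Brier score, the squared gap between stated confidence and what actually happened. MindFrame's exact feedback formula isn't specified here, so treat this as an illustrative sketch of that standard metric rather than the app's implementation.

```python
def brier_score(confidence, correct):
    """Squared gap between stated confidence (0.0-1.0) and the
    outcome (1 if correct, 0 if not). Lower is better: 0.0 is a
    perfect call, 0.25 is what always saying "50% sure" earns."""
    return (confidence - correct) ** 2

def mean_brier(answers):
    """Average Brier score over (confidence, was_correct) pairs."""
    return sum(brier_score(c, o) for c, o in answers) / len(answers)

# Being "99% sure" and wrong is heavily penalized...
risky = brier_score(0.99, 0)   # ~0.98, near the worst possible score
# ...while a hedged 60% on the same miss costs far less.
hedged = brier_score(0.6, 0)   # 0.36
```

The asymmetry is the point of the feedback: confident misses hurt much more than hedged ones, which is exactly the pressure that trains overconfidence out.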