Out-of-Distribution Detection

Out-of-distribution (OOD) detection is the part of a system that tries to notice when an input is outside what the model learned from. Many models will still produce an answer on unfamiliar inputs, and they may sound confident even when they’re guessing. OOD detection adds a safety signal so the system can pause, warn, or route the case to a fallback instead of treating every input as normal.

This differs from confidence calibration and uncertainty quantification. Calibration checks whether confidence scores match real accuracy. Uncertainty quantification estimates how unsure the model is. OOD detection asks a different question: does this input look like something the model has seen before? Systems often compute a “typicality” score, compare it against a threshold to decide when to flag a case, and validate that threshold on realistic distribution-shift scenarios.
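As a rough illustration of the score-plus-threshold idea, here is a minimal sketch using the maximum softmax probability as the "typicality" score, a common baseline. The function names and the threshold value of 0.7 are illustrative choices, not a prescribed method:

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the last axis.
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def typicality_score(logits):
    # Maximum softmax probability: higher means the model's
    # prediction looks more like a familiar, confident case.
    return softmax(logits).max(axis=-1)

def flag_ood(logits, threshold=0.7):
    # Flag inputs whose typicality score falls below the threshold.
    return typicality_score(logits) < threshold

# A peaked (familiar-looking) input vs. a near-uniform (unfamiliar) one.
logits = np.array([
    [6.0, 0.5, 0.2],   # confident prediction -> high score
    [1.1, 1.0, 0.9],   # nearly flat logits   -> low score
])
print(flag_ood(logits))  # -> [False  True]
```

In practice the threshold would be tuned on held-out data that includes the kinds of distribution shift the system is expected to face, rather than fixed in advance.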