Sensitivity index


The sensitivity index or d' (pronounced 'dee-prime') is a statistic used in signal detection theory.

d' measures the separation between the means of the signal and noise distributions, in units of the standard deviation of the noise distribution. An estimate of d' can be computed from measured hit and false-alarm rates.
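Writing \mu_S and \mu_N for the means of the signal and noise distributions and \sigma_N for the standard deviation of the noise distribution, the definition above reads

    d' = \frac{\mu_S - \mu_N}{\sigma_N}.

Under the additional assumption, common in signal detection theory, that the two distributions are Gaussian with equal variance, d' is estimated from response rates as

    d' = Z(H) - Z(F),

where H is the hit rate, F is the false-alarm rate, and Z is the inverse of the cumulative distribution function of the standard normal distribution.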

A higher d' indicates that the signal can be more readily detected.
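A minimal computational sketch of the estimate Z(H) − Z(F), assuming SciPy's norm.ppf as the inverse normal CDF; the helper name d_prime is illustrative, not part of this article:

    # Sketch: estimate d' from a hit rate and a false-alarm rate under the
    # equal-variance Gaussian model. norm.ppf supplies Z, the inverse of the
    # standard normal cumulative distribution function.
    from scipy.stats import norm

    def d_prime(hit_rate, false_alarm_rate):
        """Estimate d' as Z(hit rate) - Z(false-alarm rate)."""
        return norm.ppf(hit_rate) - norm.ppf(false_alarm_rate)

    # Example: 85% hits and 20% false alarms give d' of about 1.88.
    print(d_prime(0.85, 0.20))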
