Sensitivity index

The sensitivity index (also the discriminability index or detectability index) is a dimensionless statistic used in signal detection theory. A higher index indicates that the signal can be more readily detected.
Definition
The discriminability index is the separation between the means of two distributions (typically the signal and the noise distributions), in units of the standard deviation.
Equal variances/covariances
For two univariate distributions $a$ and $b$ with the same standard deviation $\sigma$, it is denoted by $d'$ ('dee-prime'):
- $d' = \frac{\left|\mu_a - \mu_b\right|}{\sigma}$.
In higher dimensions, i.e. with two multivariate distributions with the same variance-covariance matrix $\Sigma$ (whose symmetric square-root, the standard deviation matrix, is $S$), this generalizes to the Mahalanobis distance between the two distributions:
- $d' = \sqrt{(\boldsymbol{\mu}_a-\boldsymbol{\mu}_b)'\Sigma^{-1}(\boldsymbol{\mu}_a-\boldsymbol{\mu}_b)} = \lVert S^{-1}(\boldsymbol{\mu}_a-\boldsymbol{\mu}_b)\rVert = \lVert\boldsymbol{\mu}_a-\boldsymbol{\mu}_b\rVert/\sigma_\mu$,
where $\sigma_\mu = 1/\lVert S^{-1}\boldsymbol{u}\rVert$ is the sd of the 1d slice of the distributions along the unit vector $\boldsymbol{u}$ through the means, i.e. the multivariate $d'$ equals the $d'$ of the 1d slice through the means.[1]
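A minimal Python sketch of these two formulas (function names are ours, not from the cited paper):

```python
import numpy as np

def dprime(mu_a, mu_b, sigma):
    """Univariate d': separation of the means in units of the common sd."""
    return abs(mu_a - mu_b) / sigma

def dprime_mahalanobis(mu_a, mu_b, Sigma):
    """Multivariate d': Mahalanobis distance between the means under a
    shared covariance matrix Sigma."""
    diff = np.asarray(mu_a, float) - np.asarray(mu_b, float)
    return float(np.sqrt(diff @ np.linalg.solve(Sigma, diff)))

print(dprime(2.0, 0.0, 1.0))  # two unit-sd distributions two sds apart: d' = 2
print(dprime_mahalanobis([1.0, 1.0], [0.0, 0.0],
                         np.array([[1.0, 0.5],
                                   [0.5, 1.0]])))
```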
Unequal variances/covariances
When the two distributions have different standard deviations (or in general dimensions, different covariance matrices), there exist several contending indices, all of which reduce to $d'$ for equal variance/covariance. The Bayes discriminability index for two normal distributions is based on the amount of their overlap, i.e. the optimal (Bayes) error of classification $e_b$, or its complement, the optimal accuracy $a_b$:
- $d'_b = -2Z(e_b) = 2Z(a_b)$,[1]
where $Z(p)$ is the inverse cumulative distribution function of the standard normal. This index measures the best possible (Bayes-optimal) classification accuracy between two univariate or multivariate normals, but may also be used when the distributions are close to normal. $e_b$ and $d'_b$ do not have closed-form expressions, but can be computed numerically[1] (Matlab code).
In particular, for a yes/no task between two univariate normal distributions with means $\mu_a, \mu_b$ and variances $v_a > v_b$, the Bayes-optimal classification accuracies are:[1]
- $p(A|a) = p\left(\chi'^2_{1,\,v_a\lambda} > v_b c\right), \quad p(B|b) = p\left(\chi'^2_{1,\,v_b\lambda} < v_a c\right)$,
where $\chi'^2$ denotes the non-central chi-squared distribution, $\lambda = \left(\frac{\mu_a-\mu_b}{v_a-v_b}\right)^2$, and $c = \lambda + \frac{\ln v_a - \ln v_b}{v_a - v_b}$. The Bayes discriminability is then $d'_b = 2Z(a_b)$, where $a_b = \left[p(A|a)+p(B|b)\right]/2$ is the overall accuracy (assuming equal prior probabilities).
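Assuming SciPy's `ncx2` for the non-central chi-squared, these expressions can be evaluated directly (a minimal sketch, not the authors' published code; names are ours):

```python
import numpy as np
from scipy.stats import ncx2, norm

def bayes_dprime_yes_no(mu_a, v_a, mu_b, v_b):
    """Bayes-optimal yes/no accuracy and d'_b for N(mu_a, v_a) vs N(mu_b, v_b)."""
    assert v_a > v_b, "label the wider distribution 'a'"
    lam = ((mu_a - mu_b) / (v_a - v_b)) ** 2
    c = lam + (np.log(v_a) - np.log(v_b)) / (v_a - v_b)
    p_a = ncx2.sf(v_b * c, df=1, nc=v_a * lam)   # p(A|a)
    p_b = ncx2.cdf(v_a * c, df=1, nc=v_b * lam)  # p(B|b)
    a_b = (p_a + p_b) / 2                        # overall accuracy, equal priors
    return 2 * norm.ppf(a_b), a_b                # d'_b = 2 Z(a_b)
```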
For a two-interval task between these distributions, the optimal accuracy is $a_b = p\left(\tilde{\chi}^2_{\boldsymbol{w},\boldsymbol{k},\boldsymbol{\lambda},0,0} > 0\right)$ ($\tilde{\chi}^2$ denotes the generalized chi-squared distribution), where the weights are $\boldsymbol{w} = \begin{bmatrix} v_a & -v_b \end{bmatrix}$, the degrees of freedom are $\boldsymbol{k} = \begin{bmatrix} 1 & 1 \end{bmatrix}$, and the non-centralities are $\boldsymbol{\lambda} = \begin{bmatrix} v_a & v_b \end{bmatrix}\lambda$, with $\lambda$ as defined above. The Bayes discriminability is $d'_b = \sqrt{2}\,Z(a_b)$.
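SciPy has no generalized chi-squared distribution, so the two-interval accuracy can be estimated by Monte Carlo, sampling the two weighted non-central chi-squared terms directly (a sketch under the same naming assumptions as above):

```python
import numpy as np
from scipy.stats import norm

def bayes_dprime_two_interval(mu_a, v_a, mu_b, v_b, n=1_000_000, seed=0):
    """Monte Carlo estimate of the two-interval optimal accuracy and d'_b."""
    rng = np.random.default_rng(seed)
    lam = ((mu_a - mu_b) / (v_a - v_b)) ** 2
    # A chi'^2_{1, nc} variate is the square of a unit-variance normal with
    # mean sqrt(nc); weight the two independent terms by v_a and -v_b.
    q_a = v_a * rng.normal(np.sqrt(v_a * lam), 1.0, n) ** 2
    q_b = v_b * rng.normal(np.sqrt(v_b * lam), 1.0, n) ** 2
    a_b = np.mean(q_a - q_b > 0)
    return np.sqrt(2) * norm.ppf(a_b), a_b       # d'_b = sqrt(2) Z(a_b)
```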
A sub-optimal discriminability index (since it uses a single criterion) is $d'_a = \sqrt{2}\,Z(\mathrm{AUC})$, i.e. $\sqrt{2}$ times the $z$-score of the area under the receiver operating characteristic curve, or AUC.[2] A common closed-form approximation related to this is to take the average of the variances, i.e. the rms of the two standard deviations: $d'_a = \left|\mu_a-\mu_b\right|/\sigma_\mathrm{rms}$, where $\sigma_\mathrm{rms} = \sqrt{\left(\sigma_a^2+\sigma_b^2\right)/2}$,[2] extended to general dimensions as the Mahalanobis distance using the pooled covariance, i.e. with $S_\mathrm{rms} = \left[\left(\Sigma_a+\Sigma_b\right)/2\right]^{1/2}$ as the common sd matrix.[1] Another index is $d'_e = \left|\mu_a-\mu_b\right|/\sigma_\mathrm{avg}$, using the average sd $\sigma_\mathrm{avg} = \left(\sigma_a+\sigma_b\right)/2$, extended to general dimensions using $S_\mathrm{avg} = \left(S_a+S_b\right)/2$ as the common sd matrix.[1] Another common index is $d'_{gm} = \left|\mu_a-\mu_b\right|/\sigma_{gm}$, using the geometric mean sd $\sigma_{gm} = \sqrt{\sigma_a\sigma_b}$.[3]: 7
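These three closed-form indices are straightforward to compute (a sketch; names ours):

```python
import numpy as np

def approx_indices(mu_a, sigma_a, mu_b, sigma_b):
    """Return d'_a (rms sd), d'_e (mean sd) and d'_gm (geometric-mean sd)."""
    d = abs(mu_a - mu_b)
    return (d / np.sqrt((sigma_a**2 + sigma_b**2) / 2),  # d'_a
            d / ((sigma_a + sigma_b) / 2),               # d'_e
            d / np.sqrt(sigma_a * sigma_b))              # d'_gm
```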
Comparison of the indices
It has been shown that:[1]
- $d'_a \leq d'_e \leq d'_b$.
Thus, $d'_a$ and $d'_e$ underestimate the optimal discriminability $d'_b$ of normal distributions, whereas $d'_{gm}$ can overestimate it. Simpson and Fitter [2] promoted $d'_a$ as the best index, particularly for two-interval tasks, but Das and Geisler [1] have shown that $d'_b$ is the optimal discriminability in all cases, and $d'_e$ is a better approximation than $d'_a$, even for two-interval tasks.
The approximate index $d'_{gm}$, which uses the geometric mean of the sd's, is less than $d'_b$ at small discriminability, but greater at large discriminability.[1]
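These relations can be checked numerically; the following self-contained sketch (example parameters are an arbitrary choice of ours) computes all four indices for one unequal-variance pair:

```python
import numpy as np
from scipy.stats import ncx2, norm

mu_a, v_a, mu_b, v_b = 1.0, 9.0, 0.0, 1.0  # wider distribution labelled 'a'

# Bayes index d'_b via the non-central chi-squared accuracies above.
lam = ((mu_a - mu_b) / (v_a - v_b)) ** 2
c = lam + (np.log(v_a) - np.log(v_b)) / (v_a - v_b)
a_b = (ncx2.sf(v_b * c, 1, v_a * lam) + ncx2.cdf(v_a * c, 1, v_b * lam)) / 2
d_b = 2 * norm.ppf(a_b)

# Approximate indices.
d = abs(mu_a - mu_b)
sa, sb = np.sqrt(v_a), np.sqrt(v_b)
d_a, d_e, d_gm = d / np.sqrt((v_a + v_b) / 2), d / ((sa + sb) / 2), d / np.sqrt(sa * sb)

print(f"d'_a = {d_a:.2f}, d'_e = {d_e:.2f}, d'_gm = {d_gm:.2f}, d'_b = {d_b:.2f}")
```

For these parameters the ordering $d'_a \leq d'_e \leq d'_b$ is visible, with $d'_{gm}$ also falling below $d'_b$ since the discriminability is small.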
References
- ^ a b c d e f g h i j Das, Abhranil; Geisler, Wilson S. (2020). "A method to integrate and classify normal distributions". arXiv:2012.14331.
- ^ a b c Simpson, A. J.; Fitter, M. J. (1973). "What is the best index of detectability?". Psychological Bulletin. 80 (6): 481–488. doi:10.1037/h0035203.
- ^ MacMillan, N.; Creelman, C. (2005). Detection Theory: A User's Guide. Lawrence Erlbaum Associates. ISBN 9781410611147.
- Wickens, Thomas D. (2001). Elementary Signal Detection Theory. OUP USA. ch. 2, p. 20. ISBN 0-19-509250-3.
External links
- Interactive signal detection theory tutorial including calculation of d′.