The Mahalanobis distance between two points $x$ and $y$ is the quantity

$$\rho(x, y) = \sqrt{(x - y)^T \Lambda (x - y)},$$

where $x$ and $y$ are vectors and $\Lambda$ is a positive-definite matrix ($T$ denotes transposition). The Mahalanobis distance is used in multi-dimensional statistical analysis, in particular for testing hypotheses and for the classification of observations. It was introduced by P. Mahalanobis, who used the quantity
$$\sqrt{(\mu_1 - \mu_2)^T \Sigma^{-1} (\mu_1 - \mu_2)}$$

as a distance between two normal distributions with expectations $\mu_1$ and $\mu_2$ and common covariance matrix $\Sigma$. The Mahalanobis distance between two samples (from distributions with identical covariance matrices), or between a sample and a distribution, is defined by replacing the corresponding theoretical moments by sample moments. As an estimate of the Mahalanobis distance between two distributions one uses the Mahalanobis distance between samples extracted from these distributions or, in the case where a linear discriminant function is utilized, the statistic

$$\Phi^{-1}(\nu_1) + \Phi^{-1}(\nu_2),$$

where $\nu_1$ and $\nu_2$ are the frequencies of correct classification in the first and the second collection, respectively, and $\Phi$ is the normal distribution function with expectation 0 and variance 1.
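The sample-based estimate described above can be sketched in Python. This is a minimal illustration, not code from the source: the function names and the use of the pooled covariance estimate $((n_1 - 1)S_1 + (n_2 - 1)S_2)/(n_1 + n_2 - 2)$ as the "sample moment" replacing the common covariance matrix are assumptions made for the example.

```python
import numpy as np

def mahalanobis(x, y, cov):
    """Mahalanobis distance between vectors x and y, with the matrix
    taken as the inverse of the given covariance matrix cov."""
    diff = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    return float(np.sqrt(diff @ np.linalg.inv(cov) @ diff))

def sample_mahalanobis(sample1, sample2):
    """Estimate of the Mahalanobis distance between two distributions
    with identical covariance matrices, obtained by replacing the
    theoretical moments by sample means and a pooled sample covariance
    (an illustrative choice of estimator)."""
    s1 = np.asarray(sample1, dtype=float)
    s2 = np.asarray(sample2, dtype=float)
    n1, n2 = len(s1), len(s2)
    # Pooled covariance: weighted combination of the two sample covariances.
    pooled = ((n1 - 1) * np.cov(s1, rowvar=False)
              + (n2 - 1) * np.cov(s2, rowvar=False)) / (n1 + n2 - 2)
    return mahalanobis(s1.mean(axis=0), s2.mean(axis=0), pooled)
```

For example, with the identity covariance matrix the Mahalanobis distance reduces to the Euclidean distance, so `mahalanobis([0, 0], [3, 4], np.eye(2))` gives 5. The alternative estimate via the linear discriminant function would use the inverse normal distribution function applied to the two observed frequencies of correct classification.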
[1] P. Mahalanobis, "On tests and measures of group divergence I. Theoretical formulae", J. and Proc. Asiat. Soc. of Bengal, 26 (1930) pp. 541–588
[2] P. Mahalanobis, "On the generalized distance in statistics", Proc. Nat. Inst. Sci. India (Calcutta), 2 (1936) pp. 49–55
[3] T.W. Anderson, "Introduction to multivariate statistical analysis", Wiley (1958)
[4] S.A. Aivazyan, Z.I. Bezhaeva, O.V. Staroverov, "Classifying multivariate observations", Moscow (1974) (in Russian)
[5] A.I. Orlov, "On the comparison of algorithms for classifying by results of observations of actual data", Dokl. Moskov. Obshch. Isp. Prirod. 1985, Otdel. Biol. (1987) pp. 79–82 (in Russian)
Mahalanobis distance. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Mahalanobis_distance&oldid=17720