
Workshop:

The workshop is kindly supported by SPSS and StatSoft.

" Statistical data mining between research and practice "

27/28 February 2004 in Hamburg

Angelika van der Linde
(Institute for Statistics, University of Bremen)




Dimension Reduction with Linear Discriminant Functions

For two random vectors X and Y, the mutual information is defined as the Kullback-Leibler distance between the joint density of X and Y and the product of their marginal densities. Under the assumption of bi-affinity of the log-odds ratio function (which characterizes the association between X and Y), the symmetrized mutual information can be represented as the trace of the product of a parameter matrix and the covariance matrix of X and Y. The assumption of bi-affinity is immediately met by multivariate Normal and multinomial distributions, but is also often used in (logistic) regression models. It is shown that, under special distributional assumptions, eigendecompositions of the matrix underlying the symmetrized mutual information yield familiar multivariate techniques such as canonical correlation analysis or Fisher's linear discriminant functions. Such eigendecompositions are then suggested more generally as techniques of dimension reduction with generalized linear discriminant functions.
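As a concrete illustration of the Gaussian special case the abstract refers to, the following sketch computes the canonical correlations of X and Y from an eigendecomposition of Sxx^{-1} Sxy Syy^{-1} Syx and plugs them into the standard Gaussian mutual-information formula I(X;Y) = -1/2 sum_i log(1 - rho_i^2). The dimensions, sample size, and simulated covariance are hypothetical; this shows only the familiar connection between mutual information and canonical correlation analysis, not the speaker's general construction for the symmetrized mutual information.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions and sample size for the illustration.
p, q, n = 4, 3, 10_000

# Simulate jointly Gaussian (X, Y) with a random positive-definite
# joint covariance matrix.
A = rng.normal(size=(p + q, p + q))
Sigma = A @ A.T
Z = rng.multivariate_normal(np.zeros(p + q), Sigma, size=n)

# Sample covariance blocks of the joint vector (X, Y).
S = np.cov(Z, rowvar=False)
Sxx, Sxy = S[:p, :p], S[:p, p:]
Syx, Syy = S[p:, :p], S[p:, p:]

# Canonical correlation analysis: the nonzero eigenvalues of
# Sxx^{-1} Sxy Syy^{-1} Syx are the squared canonical correlations.
M = np.linalg.solve(Sxx, Sxy) @ np.linalg.solve(Syy, Syx)
rho2 = np.sort(np.linalg.eigvals(M).real)[::-1][: min(p, q)]

# Gaussian mutual information: I(X;Y) = -1/2 * sum log(1 - rho_i^2).
mi = -0.5 * np.sum(np.log(1.0 - rho2))

print("canonical correlations:", np.sqrt(rho2))
print("Gaussian mutual information estimate:", mi)
```

For multinomial or logistic-regression settings, the same eigendecomposition idea applies to the corresponding parameter matrix rather than to Gaussian covariance blocks, which is the generalization the talk develops.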
