Linear Discriminant Analysis (LDA) separates samples of distinct groups by transforming the data to a space that maximises the between-class separability while minimising the within-class variability.
The between-class scatter matrix $S_b$ is defined as

\[ S_b = \sum_{i=1}^{g} N_i (\bar{x}_i - \bar{x})(\bar{x}_i - \bar{x})^T \]

The within-class scatter matrix $S_w$ is defined as

\[ S_w = \sum_{i=1}^{g} \sum_{j=1}^{N_i} (x_{i,j} - \bar{x}_i)(x_{i,j} - \bar{x}_i)^T \]

where $x_{i,j}$ is the $n$-dimensional data point $j$ from class $\pi_i$, $N_i$ is the number of training examples from class $\pi_i$, $g$ is the total number of classes or groups, $\bar{x}_i$ is the mean of class $\pi_i$, and $\bar{x}$ is the overall mean of the data.
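As a minimal sketch of these definitions, the two scatter matrices can be computed directly with NumPy (the function name `scatter_matrices` is an illustrative choice, not from the text):

```python
import numpy as np

def scatter_matrices(X, y):
    """Compute the between-class (Sb) and within-class (Sw) scatter matrices.

    X: (N, n) data matrix; y: (N,) class labels.
    """
    n = X.shape[1]
    overall_mean = X.mean(axis=0)
    Sb = np.zeros((n, n))
    Sw = np.zeros((n, n))
    for c in np.unique(y):
        Xc = X[y == c]                           # samples of class c
        Ni = Xc.shape[0]
        mean_c = Xc.mean(axis=0)
        diff = (mean_c - overall_mean).reshape(-1, 1)
        Sb += Ni * (diff @ diff.T)               # between-class term
        Sw += (Xc - mean_c).T @ (Xc - mean_c)    # within-class term
    return Sb, Sw
```

A useful sanity check is that $S_b + S_w$ equals the total scatter matrix of the data about the overall mean.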
The main objective of LDA is to find a projection matrix $P_{lda}$ that maximises the ratio of the determinant of $S_b$ to the determinant of $S_w$ (Fisher's criterion):

\[ P_{lda} = \arg\max_{P} \frac{\left| P^T S_b P \right|}{\left| P^T S_w P \right|} \]
Fisher's criterion is maximised when the projection matrix $P_{lda}$ is composed of the eigenvectors of $S_w^{-1} S_b$ corresponding to its largest eigenvalues.
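The eigenproblem above can be sketched as follows, assuming $S_w$ is invertible (in practice a small regularisation term is often added when it is not; the function name `lda_projection` is an illustrative choice):

```python
import numpy as np

def lda_projection(Sb, Sw, k):
    """Return the (n, k) projection matrix P_lda whose columns are the
    eigenvectors of Sw^{-1} Sb with the k largest eigenvalues."""
    eigvals, eigvecs = np.linalg.eig(np.linalg.inv(Sw) @ Sb)
    order = np.argsort(eigvals.real)[::-1]   # sort by decreasing eigenvalue
    return eigvecs[:, order[:k]].real        # keep the top-k directions
```

Note that $S_b$ has rank at most $g - 1$, so at most $g - 1$ eigenvalues are nonzero and a larger $k$ adds no discriminative directions.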
The example below illustrates how the LDA projection increases the between-class variability while minimising the within-class variability.
LDA seeks directions that are efficient for discriminating data, whereas PCA seeks directions that are efficient for representing data. LDA has been applied in areas such as speaker recognition, face recognition, bankruptcy prediction, marketing, and biomedical studies.