Linear Discriminant Analysis (LDA) separates samples of distinct groups by transforming the data to a space that maximises the between-class separability while minimising the within-class variability.

The between-class scatter matrix *S*_{b} is defined as

S_b = \sum_{i=1}^{g} N_i (\bar{x}_i - \bar{x})(\bar{x}_i - \bar{x})^T

The within-class scatter matrix *S*_{w} is defined as

S_w = \sum_{i=1}^{g} \sum_{j=1}^{N_i} (x_{i,j} - \bar{x}_i)(x_{i,j} - \bar{x}_i)^T

where *x*_{i,j} is an *n*-dimensional data point *j* from class *p*_{i}, *N*_{i} is the number of training examples from class *p*_{i}, *g* is the total number of classes or groups, \bar{x}_i is the mean of class *p*_{i}, and \bar{x} is the overall mean of the data.
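As a concrete sketch of these two definitions, the scatter matrices can be computed with NumPy; the two-class, two-dimensional toy data below is hypothetical, chosen only for illustration:

```python
import numpy as np

# Hypothetical toy data: two classes of 2-D points (values are illustrative).
X1 = np.array([[4.0, 2.0], [2.0, 4.0], [2.0, 3.0], [3.0, 6.0], [4.0, 4.0]])
X2 = np.array([[9.0, 10.0], [6.0, 8.0], [9.0, 5.0], [8.0, 7.0], [10.0, 8.0]])
classes = [X1, X2]

overall_mean = np.vstack(classes).mean(axis=0)  # \bar{x}

S_b = np.zeros((2, 2))  # between-class scatter
S_w = np.zeros((2, 2))  # within-class scatter
for Xi in classes:
    Ni = Xi.shape[0]            # N_i
    mi = Xi.mean(axis=0)        # class mean \bar{x}_i
    d = (mi - overall_mean).reshape(-1, 1)
    S_b += Ni * d @ d.T         # N_i (x_i - x)(x_i - x)^T
    centered = Xi - mi
    S_w += centered.T @ centered  # sum over class members
```

A useful sanity check is the identity S_b + S_w = S_t, where S_t is the total scatter of all points about the overall mean.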

The main objective of LDA is to find a projection matrix *P*_{lda} that maximises the ratio of the determinant of the projected *S*_{b} to the determinant of the projected *S*_{w} (Fisher's criterion):

P_{lda} = \arg\max_{P} \frac{|P^T S_b P|}{|P^T S_w P|}

Fisher's criterion is maximised when the projection matrix *P*_{lda} is composed of the leading eigenvectors of S_w^{-1} S_b (assuming *S*_{w} is non-singular).
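A minimal sketch of this eigen-decomposition step, again on hypothetical two-class toy data (with two classes there is at most g − 1 = 1 discriminative direction):

```python
import numpy as np

# Illustrative two-class toy data (hypothetical values).
X1 = np.array([[4.0, 2.0], [2.0, 4.0], [2.0, 3.0], [3.0, 6.0], [4.0, 4.0]])
X2 = np.array([[9.0, 10.0], [6.0, 8.0], [9.0, 5.0], [8.0, 7.0], [10.0, 8.0]])
overall_mean = np.vstack([X1, X2]).mean(axis=0)

# Build the scatter matrices as defined above.
S_b = np.zeros((2, 2))
S_w = np.zeros((2, 2))
for Xi in (X1, X2):
    mi = Xi.mean(axis=0)
    d = (mi - overall_mean).reshape(-1, 1)
    S_b += Xi.shape[0] * d @ d.T
    S_w += (Xi - mi).T @ (Xi - mi)

# Eigen-decomposition of S_w^{-1} S_b; the leading eigenvectors form P_lda.
eigvals, eigvecs = np.linalg.eig(np.linalg.inv(S_w) @ S_b)
order = np.argsort(eigvals.real)[::-1]
P_lda = eigvecs[:, order].real

# Project each class onto the most discriminative direction.
y1 = X1 @ P_lda[:, 0]
y2 = X2 @ P_lda[:, 0]
```

In practice a solver for the generalised eigenproblem (e.g. `scipy.linalg.eigh(S_b, S_w)`) is numerically preferable to forming the explicit inverse; the inverse is used here only to mirror the formula.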

The example below illustrates how the LDA projection increases the between-class variability while minimising the within-class variability.

LDA seeks directions that are efficient for discriminating data, whereas PCA seeks directions that are efficient for representing data. LDA has been applied in areas such as speaker recognition, face recognition, bankruptcy prediction, marketing, and biomedical studies.
