Linear Discriminant Analysis (LDA)

Linear Discriminant Analysis (LDA) separates samples of distinct groups by transforming the data to a space that maximises the between-class separability while minimising the within-class variability.

The between-class scatter matrix Sb is defined as

S_b = \sum_{i=1}^{g} N_i (\bar{x}_i - \bar{x})(\bar{x}_i - \bar{x})^T

The within-class scatter matrix Sw is defined as

S_w = \sum_{i=1}^{g} \sum_{j=1}^{N_i} (x_{i,j} - \bar{x}_i)(x_{i,j} - \bar{x}_i)^T

where x_{i,j} is the j-th n-dimensional data point from class \pi_i, \bar{x}_i is the mean of class \pi_i, \bar{x} is the global mean of all samples, N_i is the number of training examples from class \pi_i, and g is the total number of classes or groups.
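As a quick sketch of these definitions, the two scatter matrices can be computed with NumPy; the small two-class dataset below is an illustrative assumption, not data from the text:

```python
import numpy as np

# Toy data: two classes (g = 2) in n = 2 dimensions (illustrative values).
X = np.array([[1.0, 2.0], [2.0, 3.0], [3.0, 3.0],   # class 0
              [6.0, 5.0], [7.0, 8.0], [8.0, 7.0]])  # class 1
y = np.array([0, 0, 0, 1, 1, 1])

mean_all = X.mean(axis=0)          # global mean of all samples
n = X.shape[1]
Sb = np.zeros((n, n))
Sw = np.zeros((n, n))
for c in np.unique(y):
    Xc = X[y == c]
    Ni = Xc.shape[0]               # number of samples in class c
    mean_c = Xc.mean(axis=0)       # class mean
    d = (mean_c - mean_all).reshape(-1, 1)
    Sb += Ni * (d @ d.T)                     # between-class scatter
    Sw += (Xc - mean_c).T @ (Xc - mean_c)    # within-class scatter
```

Both matrices are n-by-n and symmetric; Sb accumulates one rank-one term per class, so its rank is at most g - 1.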

The main objective of LDA is to find a projection matrix Plda that maximises the ratio of the determinant of Sb to the determinant of Sw (Fisher's criterion),

P_{lda} = \arg\max_{P} \frac{|P^T S_b P|}{|P^T S_w P|}

Fisher's criterion is maximised when the projection matrix Plda is composed of the leading eigenvectors of

S_w^{-1} S_b

assuming Sw is non-singular.
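A minimal end-to-end sketch of this eigenvector solution, again on a made-up two-class dataset (the values are illustrative assumptions):

```python
import numpy as np

# Toy data: two classes in 2-D, so LDA yields at most g - 1 = 1 direction.
X = np.array([[1.0, 2.0], [2.0, 3.0], [3.0, 3.0],   # class 0
              [6.0, 5.0], [7.0, 8.0], [8.0, 7.0]])  # class 1
y = np.array([0, 0, 0, 1, 1, 1])

mean_all = X.mean(axis=0)
Sb = np.zeros((2, 2))
Sw = np.zeros((2, 2))
for c in np.unique(y):
    Xc = X[y == c]
    d = (Xc.mean(axis=0) - mean_all).reshape(-1, 1)
    Sb += Xc.shape[0] * (d @ d.T)
    Sw += (Xc - Xc.mean(axis=0)).T @ (Xc - Xc.mean(axis=0))

# Eigenvectors of Sw^{-1} Sb, sorted by decreasing eigenvalue.
evals, evecs = np.linalg.eig(np.linalg.inv(Sw) @ Sb)
evals, evecs = evals.real, evecs.real   # eigenvalues are real for this problem
order = np.argsort(evals)[::-1]
P_lda = evecs[:, order[:1]]             # keep the top g - 1 = 1 direction

Z = X @ P_lda                           # 1-D projection of the data
```

Because Sb has rank at most g - 1, only g - 1 eigenvalues are non-zero, so the projected space has at most g - 1 useful dimensions; here the two classes end up well separated along the single projected axis.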
Projecting onto these eigenvectors increases the between-class variability while minimising the within-class variability.
Whereas PCA seeks directions that are efficient for representing data, LDA seeks directions that are efficient for discriminating between classes. LDA is used in applications such as speaker recognition, face recognition, bankruptcy prediction, marketing, and biomedical studies.

