Linear discriminant analysis: A detailed tutorial

Abstract

Linear Discriminant Analysis (LDA) is a very common technique for dimensionality reduction problems, used as a pre-processing step in machine learning and pattern classification applications. At the same time, it is often treated as a black box and is not always well understood. The aim of this paper is to build a solid intuition for what LDA is and how it works, enabling readers of all levels to gain a better understanding of LDA and to know how to apply the technique in different applications. The paper first gives the basic definitions and steps of how the LDA technique works, supported by visual explanations of these steps. Moreover, the two methods of computing the LDA space, i.e. the class-dependent and class-independent methods, are explained in detail. Then, in a step-by-step approach, two numerical examples demonstrate how the LDA space can be calculated in the class-dependent and class-independent cases. Furthermore, two of the most common LDA problems (i.e. the Small Sample Size (SSS) and non-linearity problems) are highlighted and illustrated, and state-of-the-art solutions to these problems are investigated and explained. Finally, a number of experiments were conducted on different datasets to (1) investigate the effect of the eigenvectors used to build the LDA space on the robustness of the extracted features for classification accuracy, and (2) show when the SSS problem occurs and how it can be addressed.
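To make the idea of "computing the LDA space" concrete, the following is a minimal sketch (not code from the paper) of the class-independent approach described in the abstract: build the within-class and between-class scatter matrices, solve the resulting eigenvalue problem, and project the data onto the leading eigenvectors. The function name `lda_class_independent` and the use of NumPy are illustrative assumptions.

```python
import numpy as np

def lda_class_independent(X, y, n_components):
    """Minimal class-independent LDA sketch: one shared projection for all classes.

    X : (n_samples, n_features) data matrix
    y : (n_samples,) integer class labels
    Returns the data projected onto the top `n_components` discriminant axes.
    """
    overall_mean = X.mean(axis=0)
    n_features = X.shape[1]
    S_W = np.zeros((n_features, n_features))  # within-class scatter
    S_B = np.zeros((n_features, n_features))  # between-class scatter

    for c in np.unique(y):
        X_c = X[y == c]
        mean_c = X_c.mean(axis=0)
        S_W += (X_c - mean_c).T @ (X_c - mean_c)
        diff = (mean_c - overall_mean).reshape(-1, 1)
        S_B += len(X_c) * (diff @ diff.T)

    # Solve S_W^{-1} S_B w = lambda w and keep the leading eigenvectors.
    # pinv is used as a simple guard when S_W is singular (the SSS case).
    eigvals, eigvecs = np.linalg.eig(np.linalg.pinv(S_W) @ S_B)
    order = np.argsort(eigvals.real)[::-1]
    W = eigvecs[:, order[:n_components]].real  # columns span the LDA space
    return X @ W
```

The class-dependent variant discussed in the paper differs in that a separate projection is computed per class (using that class's own within-class scatter) rather than one shared transformation.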
