BMVC 2004, Kingston, 7th-9th Sept, 2004
Oriented Discriminant Analysis
F. De la Torre and T. Kanade (Carnegie Mellon University, USA)
Linear discriminant analysis (LDA) has been an active topic of research
during the last century. However, existing algorithms have several limitations
when applied to visual data: LDA is optimal only for Gaussian-distributed
classes with equal covariance matrices, and it can extract at most C-1
features, where C is the number of classes. Moreover, LDA does not scale
well to high-dimensional data (over-fitting), and it does not necessarily
minimize the classification error. In this paper, we introduce Oriented
Discriminant Analysis (ODA), an extension of LDA that overcomes these
drawbacks. Three main novelties are proposed:
* An optimal dimensionality reduction that maximizes the Kullback-Leibler
divergence between classes is proposed. This allows us to model class
covariances and to extract more than C-1 features.
* Several covariance approximations are introduced to improve
classification in the small-sample case.
* A linear-time iterative majorization method is introduced to find a
locally optimal solution.
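As a concrete illustration of the first contribution, the following NumPy sketch evaluates the symmetric Kullback-Leibler divergence between two Gaussian classes after a linear projection B; this is the kind of between-class divergence that ODA maximizes over B. The function names and interface here are ours for illustration, not the paper's:

```python
import numpy as np

def gaussian_kl(mu0, cov0, mu1, cov1):
    # KL( N(mu0, cov0) || N(mu1, cov1) ) for multivariate Gaussians,
    # using the closed-form expression for Gaussian densities.
    d = mu0.shape[0]
    cov1_inv = np.linalg.inv(cov1)
    diff = mu1 - mu0
    _, logdet0 = np.linalg.slogdet(cov0)
    _, logdet1 = np.linalg.slogdet(cov1)
    return 0.5 * (np.trace(cov1_inv @ cov0)
                  + diff @ cov1_inv @ diff
                  - d + logdet1 - logdet0)

def projected_symmetric_kl(B, mu0, cov0, mu1, cov1):
    # Symmetric KL divergence between two Gaussian classes after
    # projecting their means and covariances onto the columns of
    # the basis B (a d x k matrix, k < d).
    m0, m1 = B.T @ mu0, B.T @ mu1
    c0, c1 = B.T @ cov0 @ B, B.T @ cov1 @ B
    return gaussian_kl(m0, c0, m1, c1) + gaussian_kl(m1, c1, m0, c0)
```

Because the criterion depends on full class covariances (not a single pooled scatter matrix), the number of useful projection directions is not capped at C-1 as in standard LDA.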
Experiments on synthetic data and on real face-recognition data are reported.