Feature transformation with class conditional decorrelation

Xu-Yao Zhang, Kaizhu Huang, Cheng-Lin Liu

Research output: Contribution to journal › Conference article › peer-review

2 Citations (Scopus)

Abstract

The well-known feature transformation model of Fisher linear discriminant analysis (FDA) can be decomposed into an equivalent two-step approach: whitening followed by principal component analysis (PCA) in the whitened space. By proving that whitening is the optimal linear transformation to the Euclidean space in the sense of minimum log-determinant divergence, we propose a transformation model called class conditional decorrelation (CCD). The objective of CCD is to diagonalize the covariance matrices of different classes simultaneously, which is efficiently optimized using a modified Jacobi method. CCD is effective in finding the common principal components among multiple classes. After CCD, the variables become class-conditionally uncorrelated, which benefits subsequent classification tasks. Combining CCD with the nearest class mean (NCM) classification model significantly improves classification accuracy. Experiments on 15 small-scale datasets and one large-scale dataset (with 3755 classes) demonstrate the scalability of CCD across different applications. We also discuss potential applications of CCD to other problems such as Gaussian mixture models and classifier ensemble learning.
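For readers who want to experiment with the idea, the sketch below illustrates the pipeline the abstract describes: jointly rotate the class covariance matrices toward diagonal form with Jacobi-style pair rotations, then classify by nearest class mean in the transformed space. This is a minimal sketch assuming NumPy, not the authors' implementation: the paper's modified Jacobi method minimizes a log-determinant divergence, whereas this stand-in uses the classical least-squares joint diagonalization criterion of Cardoso and Souloumiac (1996), which shares the same pairwise-rotation structure. The helper names (joint_diagonalize, fit_ccd_ncm, predict_ncm) and the per-dimension rescaling are our own assumptions.

```python
# A runnable sketch of the pipeline described in the abstract. NOTE: this is
# NOT the authors' implementation. The paper minimizes a log-determinant
# divergence with a modified Jacobi method; as a stand-in with the same
# pairwise-rotation structure, we use the classical least-squares joint
# diagonalization of Cardoso & Souloumiac (1996).
import numpy as np


def joint_diagonalize(covs, sweeps=100, tol=1e-8):
    """Orthogonal V that approximately diagonalizes every matrix in covs,
    via Jacobi sweeps of closed-form Givens rotations over index pairs."""
    covs = [np.array(C, dtype=float) for C in covs]
    d = covs[0].shape[0]
    V = np.eye(d)
    for _ in range(sweeps):
        rotated = False
        for p in range(d - 1):
            for q in range(p + 1, d):
                # The optimal rotation angle for the (p, q) plane comes from
                # the leading eigenvector of a 2x2 matrix pooled over classes.
                h = np.array([[C[p, p] - C[q, q], C[p, q] + C[q, p]]
                              for C in covs])
                _, U = np.linalg.eigh(h.T @ h)
                x, y = U[:, -1]              # unit eigenvector, top eigenvalue
                if x < 0.0:
                    x, y = -x, -y            # sign fix for half-angle formulas
                c = np.sqrt((1.0 + x) / 2.0)  # cos(theta)
                s = y / (2.0 * c)             # sin(theta)
                if abs(s) < tol:
                    continue                  # pair already near-diagonal
                rotated = True
                for C in covs:                # rotate rows and columns p, q
                    cp, cq = C[:, p].copy(), C[:, q].copy()
                    C[:, p], C[:, q] = c * cp + s * cq, c * cq - s * cp
                    rp, rq = C[p, :].copy(), C[q, :].copy()
                    C[p, :], C[q, :] = c * rp + s * rq, c * rq - s * rp
                vp, vq = V[:, p].copy(), V[:, q].copy()
                V[:, p], V[:, q] = c * vp + s * vq, c * vq - s * vp
        if not rotated:
            break                             # a full sweep made no rotation
    return V


def fit_ccd_ncm(X, y, reg=1e-6):
    """Hypothetical helper: estimate class covariances, jointly rotate them
    toward diagonal form, then store nearest-class-mean statistics."""
    classes = np.unique(y)
    covs = [np.cov(X[y == k], rowvar=False) + reg * np.eye(X.shape[1])
            for k in classes]
    V = joint_diagonalize(covs)
    # Rescale each rotated dimension by its average within-class standard
    # deviation (our assumption; an orthogonal rotation alone would leave
    # Euclidean NCM unchanged).
    scale = np.sqrt(np.mean([np.diag(V.T @ C @ V) for C in covs], axis=0))
    Z = (X @ V) / scale
    means = np.stack([Z[y == k].mean(axis=0) for k in classes])
    return V, scale, means, classes


def predict_ncm(X, V, scale, means, classes):
    """Nearest class mean in the transformed, rescaled space."""
    Z = (X @ V) / scale
    d2 = ((Z[:, None, :] - means[None, :, :]) ** 2).sum(axis=-1)
    return classes[d2.argmin(axis=1)]
```

Usage is a single line, e.g. `y_pred = predict_ncm(X_test, *fit_ccd_ncm(X_train, y_train))`; the least-squares criterion is chosen here only because it admits a closed-form per-pair rotation, keeping the sketch short.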

Original language: English
Article number: 6729573
Pages (from-to): 887-896
Number of pages: 10
Journal: Proceedings - IEEE International Conference on Data Mining, ICDM
DOIs
Publication status: Published - 2013
Event: 13th IEEE International Conference on Data Mining, ICDM 2013 - Dallas, TX, United States
Duration: 7 Dec 2013 – 10 Dec 2013

Keywords

  • class conditional decorrelation
  • feature transformation
  • simultaneous diagonalization
