TY - JOUR
T1 - Maxi-Min discriminant analysis via online learning
AU - Xu, Bo
AU - Huang, Kaizhu
AU - Liu, Cheng-Lin
N1 - Funding Information:
This work was supported by National Basic Research Program of China (973 Program) Grants 2012CB316301 and 2012CB316302, National Natural Science Foundation of China (NSFC) Grants No. 61075052 and No. 60825301, the Strategic Priority Research Program of the Chinese Academy of Sciences (Grant XDA06030300), and the Tsinghua National Laboratory for Information Science and Technology (TNList) Cross-discipline Foundation.
PY - 2012/10
Y1 - 2012/10
AB - Linear Discriminant Analysis (LDA) is an important dimensionality reduction algorithm, but its performance is often limited on multi-class data. This limitation arises because LDA maximizes the average divergence among classes, so similar classes with small divergence tend to be merged in the resulting subspace. To address this problem, we propose a novel dimensionality reduction method called Maxi-Min Discriminant Analysis (MMDA). In contrast to traditional LDA, MMDA seeks a low-dimensional subspace that maximizes the minimal (worst-case) divergence among classes. This "minimal" formulation overcomes LDA's tendency to merge similar classes when applied to multi-class data. We formulate MMDA as a convex problem and further as a large-margin learning problem. One key contribution is an efficient online learning algorithm for solving this problem, which makes the proposed method applicable to large-scale data. Experimental results on various datasets demonstrate the efficiency and efficacy of the proposed method against five competitive approaches, as well as its scalability to data with thousands of classes.
KW - Dimensionality reduction
KW - Handwritten Chinese character recognition
KW - Linear discriminant analysis
KW - Multi-category classification
UR - http://www.scopus.com/inward/record.url?scp=84865432835&partnerID=8YFLogxK
U2 - 10.1016/j.neunet.2012.06.001
DO - 10.1016/j.neunet.2012.06.001
M3 - Article
C2 - 22831850
AN - SCOPUS:84865432835
SN - 0893-6080
VL - 34
SP - 56
EP - 64
JO - Neural Networks
JF - Neural Networks
ER -