TY - JOUR
T1 - A novel classifier ensemble method with sparsity and diversity
AU - Yin, Xu Cheng
AU - Huang, Kaizhu
AU - Hao, Hong Wei
AU - Iqbal, Khalid
AU - Wang, Zhi Bin
N1 - Funding Information:
The research was partly supported by the National Basic Research Program of China (2012CB316301), the National Natural Science Foundation of China (61105018, 61175020), and the R&D Special Fund for Public Welfare Industry (Meteorology) of China (GYHY201106039, GYHY201106047).
PY - 2014/6/25
Y1 - 2014/6/25
AB - We consider the classifier ensemble problem in this paper. Owing to its superior performance over individual classifiers, the classifier ensemble has been intensively studied in the literature. Generally speaking, there are two prevalent research directions: generating classifier components diversely, and combining multiple classifiers sparsely. While most current approaches emphasize either sparsity or diversity only, we investigate the classifier ensemble by learning both sparsity and diversity simultaneously. We formulate the classifier ensemble problem with sparsity and/or diversity learning in a general framework. In particular, the classifier ensemble with sparsity and diversity can be represented as a mathematical optimization problem. We then propose a heuristic algorithm capable of obtaining ensemble classifiers with consideration of both sparsity and diversity. We exploit a genetic algorithm and optimize sparsity and diversity for classifier selection and combination heuristically and iteratively. As one major contribution, we introduce the concept of diversity contribution ability in order to select proper classifier components and evolve the classifier weights. Finally, we extensively compare our proposed method with conventional classifier ensemble methods such as Bagging, least-squares combination, sparsity learning, and AdaBoost on UCI benchmark data sets and the Pascal Large Scale Learning Challenge 2008 webspam data. The experimental results confirm that our approach leads to better performance in many aspects.
KW - Classifier ensemble
KW - Diversity learning
KW - Genetic algorithm
KW - Neural network ensembles
KW - Sparsity learning
UR - http://www.scopus.com/inward/record.url?scp=84896534232&partnerID=8YFLogxK
U2 - 10.1016/j.neucom.2013.07.054
DO - 10.1016/j.neucom.2013.07.054
M3 - Article
AN - SCOPUS:84896534232
SN - 0925-2312
VL - 134
SP - 214
EP - 221
JO - Neurocomputing
JF - Neurocomputing
ER -