TY - GEN
T1 - Dynamic ensemble of ensembles in nonstationary environments
AU - Yin, Xu-Cheng
AU - Huang, Kaizhu
AU - Hao, Hong-Wei
PY - 2013
Y1 - 2013
N2 - Classifier ensembles are an active research topic for learning from non-stationary data. In particular, batch-growing ensemble methods present one important direction for dealing with the concept drift involved in non-stationary data. However, current batch-growing ensemble methods only combine the available component classifiers, each trained independently on a batch of non-stationary data. They simply discard the interim ensembles and hence may lose useful information carried by these fine-tuned interim ensembles. In contrast, we introduce a comprehensive hierarchical approach called Dynamic Ensemble of Ensembles (DE2), which dynamically combines classifiers into an ensemble of all the interim ensembles built from consecutive batches of non-stationary data. DE2 comprises two key stages: (1) component classifiers and interim ensembles are trained dynamically; (2) the final ensemble is then learned by exponentially weighted averaging over the available experts, i.e., the interim ensembles. We employ sparsity learning to select component classifiers judiciously, and we incorporate techniques from Dynamic Weighted Majority and Learn++.NSE to better integrate different classifiers dynamically. We perform experiments on data from a typical non-stationary environment, the Pascal Large Scale Learning Challenge 2008 Webspam data, and compare DE2 against competitive conventional ensemble methods. Experimental results confirm that our approach consistently achieves better performance and shows promising generalization ability for learning in non-stationary environments.
KW - Classifier ensemble
KW - Concept drift
KW - Ensemble of ensembles
KW - Growing ensemble
KW - Nonstationary environment
KW - Sparsity learning
UR - http://www.scopus.com/inward/record.url?scp=84893361484&partnerID=8YFLogxK
U2 - 10.1007/978-3-642-42042-9_10
DO - 10.1007/978-3-642-42042-9_10
M3 - Conference Proceeding
AN - SCOPUS:84893361484
SN - 9783642420412
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 76
EP - 83
BT - Neural Information Processing - 20th International Conference, ICONIP 2013, Proceedings
T2 - 20th International Conference on Neural Information Processing, ICONIP 2013
Y2 - 3 November 2013 through 7 November 2013
ER -
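
Note: the abstract's second stage, exponentially weighted averaging over interim ensembles, can be illustrated with a minimal sketch. The class below is an assumption-laden illustration, not the authors' implementation: the expert interface (a predict method returning labels in {-1, +1}), the learning-rate parameter eta, the new-expert initialization, and the 0/1 batch loss are all choices made for this example.

import numpy as np

class ExpWeightedEnsembleOfEnsembles:
    # Sketch of an exponentially weighted forecaster whose experts
    # are interim ensembles, in the spirit of DE2's second stage.

    def __init__(self, eta=0.5):
        self.eta = eta          # assumed learning rate for the multiplicative update
        self.experts = []       # interim ensembles accumulated so far
        self.weights = np.empty(0)

    def add_expert(self, expert):
        # Each new data batch yields a new interim ensemble (expert);
        # start it at the current mean weight so it is not drowned out.
        w0 = self.weights.mean() if self.experts else 1.0
        self.experts.append(expert)
        self.weights = np.append(self.weights, w0)

    def predict(self, X):
        # Weighted vote over expert predictions in {-1, +1}.
        votes = np.array([e.predict(X) for e in self.experts])
        w = self.weights / self.weights.sum()
        return np.sign(w @ votes)

    def update(self, X, y):
        # Exponentially weighted update: experts that err on the
        # newest batch lose weight exponentially fast.
        losses = np.array([np.mean(e.predict(X) != y) for e in self.experts])
        self.weights *= np.exp(-self.eta * losses)
        self.weights /= self.weights.sum()

Any batch-trained classifier exposing a predict method that outputs labels in {-1, +1} (for instance, a scikit-learn estimator wrapped accordingly) could serve as an expert in this sketch.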