Dynamic ensemble of ensembles in nonstationary environments

Xu-Cheng Yin, Kaizhu Huang, Hong-Wei Hao

Research output: Chapter in Book/Report/Conference proceeding › Conference Proceeding › peer-review

4 Citations (Scopus)

Abstract

Classifier ensembles are an active topic in learning from non-stationary data. In particular, batch growing ensemble methods are one important direction for dealing with the concept drift involved in non-stationary data. However, current batch growing ensemble methods combine only the available component classifiers, each trained independently from a batch of non-stationary data. They simply discard the interim ensembles and hence may lose useful information obtained from those fine-tuned interim ensembles. In contrast, we introduce a comprehensive hierarchical approach called Dynamic Ensemble of Ensembles (DE2). The novel method combines classifiers as an ensemble of all the interim ensembles built dynamically from consecutive batches of non-stationary data. DE2 includes two key stages: (1) component classifiers and interim ensembles are dynamically trained; (2) the final ensemble is then learned by exponentially-weighted averaging over the available experts, i.e., the interim ensembles. We employ sparsity learning to select component classifiers intelligently. We also incorporate techniques from Dynamic Weighted Majority and Learn++.NSE to better integrate different classifiers dynamically. We perform experiments on data from a typical non-stationary environment, the Pascal Large Scale Learning Challenge 2008 Webspam Data, and compare our DE2 method with other competitive ensemble methods. Experimental results confirm that our approach consistently leads to better performance and has promising generalization ability for learning in non-stationary environments.
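The second stage above combines the interim ensembles (treated as experts) by exponentially-weighted averaging. As a minimal, hypothetical sketch of that general technique (not the authors' DE2 implementation; the function names, the squared-loss penalty, and the learning rate `eta` are illustrative assumptions), the expert-weighting step might look like:

```python
import math

def exp_weighted_predict(expert_preds, weights):
    """Combine expert (interim-ensemble) scores by a normalized weighted average."""
    total = sum(weights)
    return sum(w * p for w, p in zip(weights, expert_preds)) / total

def exp_weighted_update(weights, expert_preds, label, eta=0.5):
    """Exponentially down-weight each expert by its squared loss on the true label."""
    return [w * math.exp(-eta * (p - label) ** 2)
            for w, p in zip(weights, expert_preds)]

# Toy stream: two experts, the first tracking the labels more closely.
weights = [1.0, 1.0]
stream = [(0.9, 0.2, 1.0), (0.8, 0.4, 1.0), (0.1, 0.7, 0.0)]  # (pred1, pred2, label)
for p1, p2, y in stream:
    combined = exp_weighted_predict([p1, p2], weights)
    weights = exp_weighted_update(weights, [p1, p2], y)
# After the stream, the more accurate first expert holds the larger weight.
```

Under drift, newly trained interim ensembles would enter this pool with fresh weights, so the combination shifts toward experts that fit the current concept.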

Original language: English
Title of host publication: Neural Information Processing - 20th International Conference, ICONIP 2013, Proceedings
Pages: 76-83
Number of pages: 8
Edition: PART 2
DOIs
Publication status: Published - 2013
Event: 20th International Conference on Neural Information Processing, ICONIP 2013 - Daegu, Korea, Republic of
Duration: 3 Nov 2013 – 7 Nov 2013

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Number: PART 2
Volume: 8227 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 20th International Conference on Neural Information Processing, ICONIP 2013
Country/Territory: Korea, Republic of
City: Daegu
Period: 3/11/13 – 7/11/13

Keywords

  • Classifier ensemble
  • Concept drift
  • Ensemble of ensembles
  • Growing ensemble
  • Nonstationary environment
  • Sparsity learning

