Hierarchical incremental class learning with reduced pattern training

Sheng Uei Guan*, Chunyu Bao, Ru Tian Sun

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

6 Citations (Scopus)

Abstract

Hierarchical Incremental Class Learning (HICL) is a task decomposition method for the pattern classification problem. HICL has been shown to be a good classifier, but closer examination reveals areas for potential improvement. This paper proposes a theoretical model to evaluate the performance of HICL and presents an approach to improve its classification accuracy by applying the concept of Reduced Pattern Training (RPT). The theoretical analysis shows that HICL can achieve better classification accuracy than Output Parallelism [Guan and Li: IEEE Transactions on Neural Networks, 13 (2002), 542-550]. The procedure for RPT is described and compared with the original training procedure. RPT systematically reduces the size of the training data set based on the order in which sub-networks are built. Results on four benchmark classification problems show much promise for the improved model.
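The RPT idea sketched in the abstract can be illustrated with a minimal example: one sub-network is built per class, and after each sub-network is trained, the patterns of the class it has just learned are dropped, so every later sub-network trains on a smaller set. The function and variable names below (`reduced_pattern_training`, `build_subnetwork`, `class_order`) are illustrative assumptions, not the paper's actual implementation.

```python
# Hedged sketch of Reduced Pattern Training (RPT) for hierarchical
# incremental class learning. All names here are illustrative.

def reduced_pattern_training(patterns, class_order, build_subnetwork):
    """Build one sub-network per class in the given order; after each
    sub-network is built, remove the patterns of the class it learned,
    so later sub-networks see a progressively smaller training set."""
    subnetworks = []
    remaining = list(patterns)
    for cls in class_order:
        subnetworks.append(build_subnetwork(remaining, cls))
        # RPT step: drop patterns of the class just learned
        remaining = [(x, y) for (x, y) in remaining if y != cls]
    return subnetworks

# Toy usage: patterns are (feature, label) pairs; the "builder" only
# records how the training-set size shrinks at each stage.
data = [(0.1, "A"), (0.2, "A"), (0.5, "B"), (0.9, "C")]
sizes = []

def fake_builder(train_set, cls):
    sizes.append(len(train_set))  # record current training-set size
    return cls                    # stand-in for a trained sub-network

reduced_pattern_training(data, ["A", "B", "C"], fake_builder)
print(sizes)  # training-set size shrinks: [4, 2, 1]
```

The design choice to filter by already-learned classes is what makes later sub-networks cheaper to train; the order of `class_order` determines how quickly the set shrinks, which mirrors the abstract's note that reduction depends on the order of sub-networks built.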

Original language: English
Pages (from-to): 163-177
Number of pages: 15
Journal: Neural Processing Letters
Volume: 24
Issue number: 2
DOIs
Publication status: Published - Oct 2006
Externally published: Yes

Keywords

  • Classifier systems
  • Hierarchical learning
  • Instance selection
  • Output Parallelism
  • Reduced pattern training
