Hierarchical incremental class learning with output parallelism

Sheng Uei Guan*, Kai Wang

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

The major drawback of a non-modular neural network classifier is its inability to cope with the increasing complexity of classification tasks. A modular neural network (MNN) classifier can eliminate the internal interference among hidden layers, but it also ignores the useful information between classes. The hierarchical incremental class learning (HICL) scheme recently proposed for MNN classifiers further improves performance by making use of this inter-class information, but HICL still suffers from a certain degree of harmful interference within the network. In this paper, we propose a new structure for modular neural network classifiers: Hierarchical Incremental Class Learning with Output Parallelism (HICL-OP), based on HICL and output parallelism. The proposed HICL-OP not only inherits the advantages of HICL but also reduces the harmful interference faced by HICL. Experimental results on several benchmark problems show that HICL-OP outperforms both HICL and output parallelism, and that it is especially effective for classification problems with multiple output attributes.
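As a rough illustration of the scheme the abstract describes, the following sketch trains one sub-network per output attribute (the output-parallelism element) and feeds each trained sub-network's output forward as an extra input to the later ones (the hierarchical incremental element). This is a simplified reconstruction, not the authors' exact procedure: scikit-learn's MLPClassifier stands in for the sub-networks, and the toy dataset, sub-network sizes, and attribute ordering are all placeholder assumptions.

```python
# Illustrative sketch of the HICL-OP idea (assumptions noted above):
# each output attribute gets its own sub-network, and each new
# sub-network also receives the outputs of previously trained ones,
# so later sub-networks can exploit inter-class information.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))          # toy input features
Y = np.stack([                         # three synthetic binary output attributes
    (X[:, 0] + X[:, 1] > 0).astype(int),
    (X[:, 2] * X[:, 3] > 0).astype(int),
    (X[:, 4] - X[:, 5] > 0).astype(int),
], axis=1)

sub_networks = []
augmented = X                          # input vector grows as sub-networks are added
for k in range(Y.shape[1]):
    net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0)
    net.fit(augmented, Y[:, k])        # sub-network k learns only its own attribute
    sub_networks.append(net)
    # Append this sub-network's output as an extra input feature for the
    # sub-networks that follow it in the hierarchy.
    out_k = net.predict_proba(augmented)[:, [1]]
    augmented = np.hstack([augmented, out_k])

def predict(x_new):
    """Chain the trained sub-networks at inference time."""
    feats = x_new
    preds = []
    for net in sub_networks:
        p = net.predict_proba(feats)[:, [1]]
        preds.append((p > 0.5).astype(int))
        feats = np.hstack([feats, p])
    return np.hstack(preds)

print(predict(X[:5]))                  # one column of predictions per attribute
```

In this reconstruction, splitting the output attributes across independent sub-networks is what limits cross-attribute interference, while the growing input vector is what preserves the useful inter-class information; the paper's actual scheme additionally concerns how the classes are ordered and how each sub-network is constructed.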

Original language: English
Pages (from-to): 167-193
Number of pages: 27
Journal: Journal of Intelligent Systems
Volume: 16
Issue number: 2
DOIs
Publication status: Published - 2007
Externally published: Yes

Keywords

  • Incremental learning
  • Modular neural networks
  • Output attributes
  • Supervised learning
  • Task decomposition
