Abstract
The major drawback of a non-modular neural network classifier is its inability to cope with the increasing complexity of classification tasks. A modular neural network (MNN) classifier can eliminate the internal interference among hidden layers, but it also ignores useful information between classes. The hierarchical incremental class learning (HICL) scheme recently proposed for MNN classifiers further improves performance by exploiting this inter-class information, but HICL still suffers from a certain degree of harmful interference within the network. In this paper, we propose a new structure for modular neural network classifiers, Hierarchical Incremental Class Learning with Output Parallelism (HICL-OP), based on HICL and output parallelism. HICL-OP not only inherits the advantages of HICL but also reduces the harmful interference that HICL faces. Experimental results on several benchmark problems show that HICL-OP outperforms both HICL and output parallelism, and that it is especially effective for classification problems with multiple output attributes.
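The abstract combines two decomposition ideas: output parallelism, which splits a multi-attribute problem into one sub-network per output attribute, and HICL, which adds modules one class at a time, each new module receiving the frozen outputs of its predecessors. The sketch below is a rough illustration of that combination under stated assumptions; the module names, network sizes, training loop, and one-vs-rest class encoding are illustrative choices, not the paper's implementation.

```python
# A minimal, hypothetical PyTorch sketch of the HICL-OP idea.
# All names and hyperparameters here are assumptions for illustration.
import torch
import torch.nn as nn

class ChainModule(nn.Module):
    """One module in a HICL chain: it sees the raw inputs plus the frozen
    outputs of every previously trained module in the same chain."""
    def __init__(self, in_dim, prev_dim, hidden_dim=8):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim + prev_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 1),  # one-vs-rest logit for this class
        )

    def forward(self, x, prev_outputs):
        if prev_outputs:  # pass earlier modules' outputs forward (HICL)
            x = torch.cat([x] + prev_outputs, dim=1)
        return self.net(x)

def train_chain(x, class_targets, epochs=200):
    """Hierarchical incremental class learning for one output attribute:
    add one module per class, freezing each before the next is trained."""
    modules, frozen = [], []
    for y in class_targets:  # y: (N, 1) float tensor of 0/1 labels
        m = ChainModule(x.shape[1], sum(o.shape[1] for o in frozen))
        opt = torch.optim.Adam(m.parameters(), lr=1e-2)
        loss_fn = nn.BCEWithLogitsLoss()
        for _ in range(epochs):
            opt.zero_grad()
            loss_fn(m(x, frozen), y).backward()
            opt.step()
        modules.append(m)
        with torch.no_grad():  # freeze this module's outputs for successors
            frozen.append(m(x, frozen))
    return modules

def train_hicl_op(x, targets_per_attribute):
    """Output parallelism: one independent HICL chain per output attribute."""
    return [train_chain(x, class_targets)
            for class_targets in targets_per_attribute]

# Toy usage: 2 output attributes, each decomposed into 2 one-vs-rest classes.
if __name__ == "__main__":
    x = torch.randn(64, 4)
    attr1 = [(x[:, 0:1] > 0).float(), (x[:, 0:1] <= 0).float()]
    attr2 = [(x[:, 1:2] > 0).float(), (x[:, 1:2] <= 0).float()]
    chains = train_hicl_op(x, [attr1, attr2])
    print(len(chains), "chains,", len(chains[0]), "modules each")
```

Keeping the chains for different output attributes fully separate is what reduces cross-attribute interference, while the within-chain hand-off of frozen outputs preserves HICL's use of inter-class information.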
| Original language | English |
| --- | --- |
| Pages (from-to) | 167-193 |
| Number of pages | 27 |
| Journal | Journal of Intelligent Systems |
| Volume | 16 |
| Issue number | 2 |
| DOIs | |
| Publication status | Published - 2007 |
| Externally published | Yes |
Keywords
- Incremental learning
- Modular neural networks
- Output attributes
- Supervised learning
- Task decomposition