Abstract
Task decomposition with pattern distributor (PD) is a new task decomposition method for multilayered feedforward neural networks (NNs). A pattern distributor network that implements this new task decomposition method is proposed. We propose a theoretical model to analyze the performance of the pattern distributor network. A method named reduced pattern training (RPT) is also introduced, aiming to improve the performance of pattern distribution. Our analysis and the experimental results show that RPT improves the performance of the pattern distributor network significantly, and that the distributor module's classification accuracy dominates the whole network's performance. Two combination methods, namely, crosstalk-based combination and genetic-algorithm (GA)-based combination, are presented to find a suitable grouping for the distributor module. Experimental results show that this new method can reduce training time and improve network generalization accuracy when compared with a conventional method such as constructive backpropagation or a task decomposition method such as output parallelism (OP).
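To make the two-stage architecture described in the abstract concrete, the following is a minimal Python sketch of a pattern distributor network with reduced pattern training, based only on our reading of the abstract: a distributor module routes each input pattern to a class-group sub-module, and each sub-module is trained only on the patterns of its own group. The class name `PatternDistributorNetwork`, the fixed grouping, and the use of scikit-learn MLPs are illustrative assumptions, not the authors' implementation (the paper itself searches for the grouping via crosstalk-based or GA-based combination and uses its own training scheme).

```python
"""Illustrative sketch only: a two-stage pattern-distributor classifier."""
import numpy as np
from sklearn.neural_network import MLPClassifier


class PatternDistributorNetwork:
    """A distributor module first assigns an input pattern to one of several
    class groups; the sub-module trained for that group then predicts the
    final class label. Names and structure are illustrative, not taken
    from the paper."""

    def __init__(self, class_groups, hidden=(32,)):
        # class_groups: a grouping of the original class labels,
        # e.g. [[0, 1, 2], [3, 4]]. The paper finds a good grouping with
        # crosstalk-based or GA-based combination; here it is simply given.
        self.class_groups = class_groups
        self.distributor = MLPClassifier(hidden_layer_sizes=hidden, max_iter=2000)
        self.sub_modules = [MLPClassifier(hidden_layer_sizes=hidden, max_iter=2000)
                            for _ in class_groups]

    def _group_of(self, label):
        for g, classes in enumerate(self.class_groups):
            if label in classes:
                return g
        raise ValueError(f"label {label} is not covered by any group")

    def fit(self, X, y):
        X, y = np.asarray(X), np.asarray(y)
        group_labels = np.array([self._group_of(c) for c in y])
        # The distributor module only learns the group identity of a pattern.
        self.distributor.fit(X, group_labels)
        for g, module in enumerate(self.sub_modules):
            # Reduced pattern training (as we read the abstract): each
            # sub-module is trained only on the patterns whose true class
            # belongs to its own group, not on the full pattern set.
            mask = group_labels == g
            module.fit(X[mask], y[mask])
        return self

    def predict(self, X):
        X = np.asarray(X)
        groups = self.distributor.predict(X)
        out = np.empty(len(X), dtype=int)
        for g, module in enumerate(self.sub_modules):
            mask = groups == g
            if mask.any():
                # Misrouted patterns are an error source: the whole network
                # can be no more accurate than the distributor module.
                out[mask] = module.predict(X[mask])
        return out


if __name__ == "__main__":
    # Tiny usage example on scikit-learn's digits data with an arbitrary
    # two-way grouping of the ten classes.
    from sklearn.datasets import load_digits

    X, y = load_digits(return_X_y=True)
    pdn = PatternDistributorNetwork(class_groups=[[0, 1, 2, 3, 4], [5, 6, 7, 8, 9]])
    pdn.fit(X, y)
    print("training accuracy:", (pdn.predict(X) == y).mean())
```

The sketch also makes the abstract's accuracy argument visible: a pattern sent to the wrong sub-module cannot be recovered, so the distributor module's classification accuracy bounds the accuracy of the whole network.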
| Original language | English |
| --- | --- |
| Pages (from-to) | 1738-1749 |
| Number of pages | 12 |
| Journal | IEEE Transactions on Neural Networks |
| Volume | 18 |
| Issue number | 6 |
| DOIs | |
| Publication status | Published - Nov 2007 |
| Externally published | Yes |
Keywords
- Crosstalk-based combination
- Full pattern training (FPT)
- Genetic-algorithm-based combination
- Pattern distributor
- Reduced pattern training (RPT)
- Task decomposition