Incremental training based on input space partitioning and ordered attribute presentation with backward elimination

Sheng Uei Guan*, Ji Hua Ang

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

3 Citations (Scopus)


This paper presents a neural network training method, ID-BT (Incremental Discriminatory Batch Training). The method partitions the input space into two batches, significant and insignificant attributes, and orders the attributes within each batch according to their individual discrimination ability before introducing them into the network. Backward elimination of futile insignificant attributes increases the generalization accuracy of network training. Incremental Discriminatory Batch and Individual Training (ID-BIT) further improves ID-BT by introducing significant attributes individually and insignificant attributes as a batch. Both methods use an architecture that employs several incremental learning algorithms. We tested our algorithms extensively on several widely used benchmark problems from PROBEN1. The simulation results show that these two methods outperform both incremental training with an increasing input dimension and conventional batch training, where the neural network input space is not partitioned, achieving better network performance in terms of generalization accuracy.
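The partitioning and ordering step described above can be sketched as follows. This is a minimal illustration only: the discrimination measure (here, the absolute difference of class means) and the significance threshold are placeholder assumptions, not the paper's actual choices.

```python
# Hedged sketch of ID-BT's attribute partitioning and ordering step.
# The discrimination score and threshold below are illustrative stand-ins.

def discrimination(values, labels):
    """Toy per-attribute discrimination score for a two-class problem:
    absolute difference of the attribute's class-conditional means."""
    class0 = [v for v, y in zip(values, labels) if y == 0]
    class1 = [v for v, y in zip(values, labels) if y == 1]
    return abs(sum(class0) / len(class0) - sum(class1) / len(class1))

def partition_and_order(columns, labels, threshold):
    """Split attributes into significant / insignificant batches and order
    each batch by descending discrimination ability, as ID-BT prescribes.
    Returns two lists of attribute indices."""
    scored = [(discrimination(col, labels), i) for i, col in enumerate(columns)]
    significant = [i for s, i in sorted(scored, reverse=True) if s >= threshold]
    insignificant = [i for s, i in sorted(scored, reverse=True) if s < threshold]
    return significant, insignificant

# Tiny example: attribute 0 separates the classes perfectly, attribute 1 not at all.
sig, insig = partition_and_order([[0, 0, 1, 1], [1, 0, 1, 0]], [0, 0, 1, 1], 0.5)
print(sig, insig)  # -> [0] [1]
```

In the full methods, the significant batch would then be introduced into the network first (individually, in ID-BIT), and attributes in the insignificant batch that prove futile would be removed by backward elimination.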

Original language: English
Pages (from-to): 321-351
Number of pages: 31
Journal: Journal of Intelligent Systems
Issue number: 4
Publication status: Published - 2005
Externally published: Yes


Keywords
  • Backward elimination
  • Data presentation order
  • Incremental training
  • Input space partitioning
  • Neural networks
