Feature selection for modular networks based on incremental training

Sheng Uei Guan*, Jun Liu

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

11 Citations (Scopus)

Abstract

Feature selection plays an important role in classification systems. Using the classifier error rate as the evaluation function, feature selection is integrated with incremental training. A neural network classifier is implemented with an incremental training approach to detect and discard irrelevant features. By learning attributes one after another, our classifier can directly identify the attributes that make no contribution to classification; these attributes are marked and considered for removal. Combined with a Fisher's Linear Discriminant (FLD) feature-ranking scheme, three batch removal methods based on classifier error rate have been developed to discard irrelevant features. These feature-selection methods significantly reduce the computational complexity of searching among a large number of candidate feature subsets. Experimental results show that our feature selection method works well on several benchmark problems. The selected subsets are further validated by a Constructive Backpropagation (CBP) classifier, which confirms increased classification accuracy and reduced training cost.
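The core idea in the abstract — learn attributes one after another and keep an attribute only if it contributes to classification, judged by classifier error rate — can be sketched as follows. This is a minimal illustration, not the paper's method: the paper trains a neural network incrementally, whereas this sketch substitutes a simple leave-one-out nearest-centroid classifier as the error-rate evaluator; the function names and the toy data are assumptions made for the example.

```python
# Hedged sketch: attribute-wise incremental feature selection with
# classifier error rate as the evaluation function. A nearest-centroid
# classifier stands in for the paper's neural network classifier.

def error_rate(features, X, y):
    """Leave-one-out error rate of a nearest-centroid classifier
    restricted to the given feature subset."""
    errors = 0
    for i in range(len(X)):
        # Compute per-class centroids without sample i.
        cent, counts = {}, {}
        for j, (xj, yj) in enumerate(zip(X, y)):
            if j == i:
                continue
            c = cent.setdefault(yj, [0.0] * len(features))
            for k, f in enumerate(features):
                c[k] += xj[f]
            counts[yj] = counts.get(yj, 0) + 1
        for label in cent:
            cent[label] = [v / counts[label] for v in cent[label]]
        # Classify the held-out sample by nearest centroid.
        best, best_d = None, float("inf")
        for label, c in cent.items():
            d = sum((X[i][f] - c[k]) ** 2 for k, f in enumerate(features))
            if d < best_d:
                best, best_d = label, d
        if best != y[i]:
            errors += 1
    return errors / len(X)

def incremental_select(X, y, n_features):
    """Learn attributes one after another; an attribute is kept only
    if it reduces the error rate, otherwise it is marked irrelevant
    and discarded (one-pass analogue of the paper's batch removal)."""
    selected, err = [], 1.0  # start with no features, worst-case error
    for f in range(n_features):
        trial_err = error_rate(selected + [f], X, y)
        if trial_err < err:
            selected, err = selected + [f], trial_err
    return selected, err
```

On a toy dataset where only the first attribute separates the classes (e.g. `X = [[0.0, 5.0], [0.1, 1.0], [0.2, 9.0], [5.0, 2.0], [5.1, 8.0], [5.2, 3.0]]`, `y = [0, 0, 0, 1, 1, 1]`), `incremental_select(X, y, 2)` retains attribute 0 and discards the noisy attribute 1, mirroring the abstract's point that attributes contributing nothing to classification are found directly during incremental learning.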

Original language: English
Pages (from-to): 353-383
Number of pages: 31
Journal: Journal of Intelligent Systems
Volume: 14
Issue number: 4
DOIs
Publication status: Published - 2005
Externally published: Yes

Keywords

  • Classifier
  • Feature selection
  • Feedforward neural network
  • Incremental training
  • Input attribute
  • Neural network
