Back-propagation with chaos

Farideh Fazayeli*, Lipo Wang, Wen Liu

*Corresponding author for this work

Research output: Chapter in Book or Report/Conference proceeding › Conference Proceeding › peer-review

18 Citations (Scopus)

Abstract

Multilayer feed-forward neural networks are widely used and are typically trained by minimizing an error function. Back-propagation is a well-known training method for multilayer networks, but it often suffers from the local minima problem. To avoid this problem, we propose a new back-propagation training method based on chaos. We investigate whether the randomness and ergodicity of chaos can enable the learning algorithm to escape from local minima. The validity of the proposed method is examined through simulations on three real classification tasks, namely the Ionosphere, Wisconsin Breast Cancer (WBC), and credit-screening datasets. The algorithm is shown to perform better than the original back-propagation and to be comparable with the Levenberg-Marquardt algorithm, while being simpler and easier to implement.
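The abstract does not spell out how the chaotic sequence enters the training rule, so the sketch below only illustrates the general idea: standard back-propagation on a one-hidden-layer network with a logistic-map chaotic perturbation added to each weight update, annealed over training so the final phase reduces to plain back-propagation. The network size, learning rate, chaos amplitude, annealing schedule, and toy data are all assumptions for illustration, not the authors' settings.

```python
import numpy as np

rng = np.random.default_rng(0)

def logistic_map(x):
    """One step of the logistic map x <- 4x(1-x), chaotic on (0, 1)."""
    return 4.0 * x * (1.0 - x)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy 2-class data (XOR-like), a stand-in for a real dataset such as Ionosphere.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer with 4 units (assumed size).
W1 = rng.normal(scale=0.5, size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(scale=0.5, size=(4, 1)); b2 = np.zeros(1)

eta = 0.5          # learning rate (assumed)
chaos_scale = 0.1  # initial amplitude of the chaotic perturbation (assumed)
epochs = 5000

# Per-weight chaotic states, each iterated with the logistic map every update.
C1 = rng.uniform(0.05, 0.95, size=W1.shape)
C2 = rng.uniform(0.05, 0.95, size=W2.shape)

for epoch in range(epochs):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Back-propagate the squared error.
    d_out = (out - y) * out * (1.0 - out)
    d_h = (d_out @ W2.T) * h * (1.0 - h)

    # Advance the chaotic states and map them from (0, 1) to (-1, 1);
    # anneal the amplitude so late training is ordinary back-propagation.
    C1 = logistic_map(C1)
    C2 = logistic_map(C2)
    amp = chaos_scale * (1.0 - epoch / epochs)

    W2 -= eta * (h.T @ d_out) + amp * (2.0 * C2 - 1.0)
    b2 -= eta * d_out.sum(axis=0)
    W1 -= eta * (X.T @ d_h) + amp * (2.0 * C1 - 1.0)
    b1 -= eta * d_h.sum(axis=0)

mse = float(np.mean((sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) - y) ** 2))
print(f"final MSE: {mse:.4f}")
```

The annealed amplitude is one plausible way to use the ergodicity of the chaotic sequence for early exploration without preventing convergence later; the paper itself should be consulted for the actual update rule and parameter choices.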

Original language: English
Title of host publication: 2008 IEEE International Conference Neural Networks and Signal Processing, ICNNSP
Pages: 5-8
Number of pages: 4
DOIs
Publication status: Published - 2008
Externally published: Yes
Event: 2008 IEEE International Conference Neural Networks and Signal Processing, ICNNSP - Zhenjiang, China
Duration: 7 Jun 2008 - 11 Jun 2008

Publication series

Name: 2008 IEEE International Conference Neural Networks and Signal Processing, ICNNSP

Conference

Conference: 2008 IEEE International Conference Neural Networks and Signal Processing, ICNNSP
Country/Territory: China
City: Zhenjiang
Period: 7/06/08 - 11/06/08