An adaptive recurrent network training algorithm using IIR filter model and Lyapunov theory

Seng Kah Phooi*, Zhihong Man, H. R. Wu, Kai Ming Tse

*Corresponding author for this work

Research output: Chapter in Book or Report/Conference proceeding › Chapter › peer-review


A new approach to the adaptive training of a fully connected recurrent neural network (RNN), based on digital filter theory, is proposed. Each recurrent neuron is modeled as an infinite impulse response (IIR) filter. The weights of each layer in the RNN are updated adaptively so that the error between the desired output and the RNN output converges to zero asymptotically. The proposed optimization method is based on the Lyapunov theory-based adaptive filtering (LAF) method [9]. A merit of this adaptive algorithm is that it avoids computing the dynamic derivatives, which is rather complicated in an RNN. The design is independent of the stochastic properties of the input disturbances, and stability is guaranteed by Lyapunov stability theory. A simulation example of a nonstationary time series prediction problem is presented; the results validate the fast tracking property of the proposed method.
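The core idea of Lyapunov theory-based adaptive filtering can be illustrated in a scalar-weight setting. The sketch below is a minimal, hypothetical illustration (not the authors' RNN/IIR implementation): an adaptive FIR predictor whose gain is chosen so that the a posteriori error satisfies |e(k)| ≈ κ|e(k-1)| with 0 < κ < 1, making V(k) = e(k)² a decreasing Lyapunov function independently of the input statistics. The function name, filter order, and κ value are assumptions for illustration.

```python
import numpy as np

def laf_predict(d, order=4, kappa=0.5, eps=1e-8):
    """Hedged sketch of a Lyapunov-theory-based adaptive predictor.

    The adaptation gain g(k) is chosen so that the a posteriori error
    contracts geometrically, |e(k)| ~ kappa * |e(k-1)|, so that
    V(k) = e(k)^2 decreases regardless of the input's statistics.
    Parameters (order, kappa, eps) are illustrative assumptions.
    """
    w = np.zeros(order)          # adaptive weight vector
    e_prev = 0.0                 # previous a posteriori error
    errs = []
    for k in range(order, len(d)):
        x = d[k - order:k][::-1]                 # regressor of past samples
        alpha = d[k] - w @ x                     # a priori error
        # Gain forcing |e(k)| ~ kappa * |e(k-1)|; eps avoids division by zero.
        g = (x / (x @ x + eps)) * (1.0 - kappa * abs(e_prev) / (abs(alpha) + eps))
        w = w + g * alpha                        # Lyapunov-based weight update
        e_prev = d[k] - w @ x                    # a posteriori error
        errs.append(e_prev)
    return w, np.array(errs)
```

Because the error contraction is enforced directly by the gain rather than by a stochastic-gradient step, no gradient (and, in the recurrent case, no dynamic derivative) of the error surface is required, which mirrors the property claimed for the algorithm above.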

Original language: English
Title of host publication: Recent Advances in Computers, Computing and Communications
Publisher: World Scientific and Engineering Academy and Society
Number of pages: 3
ISBN (Print): 9608052629
Publication status: Published - 2002
Externally published: Yes


Keywords

  • IIR filter
  • Lyapunov stability theory
  • Recurrent Neural Network
