LightAdam: Towards a Fast and Accurate Adaptive Momentum Online Algorithm

Yangfan Zhou, Kaizhu Huang, Cheng Cheng, Xuguang Wang, Xin Liu*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

6 Citations (Scopus)

Abstract

Adaptive optimization algorithms enjoy fast convergence and have been widely exploited in pattern recognition and cognitively inspired machine learning. However, these algorithms may incur high computational cost and low generalization ability due to their projection steps. Such limitations make them difficult to apply in big data analytics, as typically encountered in cognitively inspired learning, e.g., deep learning tasks. In this paper, we propose a fast and accurate adaptive momentum online algorithm, called LightAdam, to alleviate the drawbacks of the projection steps in adaptive algorithms. The proposed algorithm substantially reduces the computational cost of each iteration by replacing high-order projection operators with one-dimensional linear searches. Moreover, we introduce a novel second-order momentum and employ dynamic learning-rate bounds, thereby obtaining higher generalization ability than other adaptive algorithms. We theoretically show that the proposed algorithm has a guaranteed convergence bound, and prove that it achieves better generalization capability than Adam.
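
To make the projection-free mechanism described above concrete, below is a minimal Python sketch of an update in this spirit: Adam-style first and second moments, per-coordinate step sizes clipped into a dynamic bound, and a Frank-Wolfe-style linear step over the feasible set in place of a projection. The feasible set (an l2 ball), the oracle lmo_l2_ball, the step schedule, and all hyperparameter values are illustrative assumptions for exposition, not the paper's exact LightAdam update rule.

```python
import numpy as np

def lmo_l2_ball(direction, radius):
    """Linear minimization oracle for an l2 ball:
    argmin_{||v|| <= radius} <direction, v>.
    This reduces to a one-dimensional search along -direction,
    which is the cheap step that replaces a full projection."""
    norm = np.linalg.norm(direction)
    if norm == 0.0:
        return np.zeros_like(direction)
    return -radius * direction / norm

def projection_free_adaptive_step(x, grad, m, v, t, radius=1.0,
                                  beta1=0.9, beta2=0.999, eps=1e-8,
                                  lr_low=1e-3, lr_high=1e-1):
    """One illustrative projection-free adaptive momentum update.

    Maintains Adam-style moments, clips the per-coordinate step size
    into [lr_low, lr_high] (dynamic learning-rate bounds), and moves
    toward the linear-oracle output instead of projecting."""
    m = beta1 * m + (1 - beta1) * grad          # first moment
    v = beta2 * v + (1 - beta2) * grad ** 2     # second moment
    m_hat = m / (1 - beta1 ** t)                # bias correction
    v_hat = v / (1 - beta2 ** t)

    step = 1.0 / (np.sqrt(v_hat) + eps)         # adaptive step sizes
    step = np.clip(step, lr_low, lr_high)       # bounded learning rates

    d = step * m_hat                            # preconditioned direction
    s = lmo_l2_ball(d, radius)                  # one-dimensional linear search
    gamma = 2.0 / (t + 2)                       # Frank-Wolfe-style weight
    x = (1 - gamma) * x + gamma * s             # convex combination stays feasible
    return x, m, v

# Toy usage: minimize ||x - b||^2 over the unit l2 ball.
rng = np.random.default_rng(0)
b = rng.normal(size=5)
x, m, v = np.zeros(5), np.zeros(5), np.zeros(5)
for t in range(1, 201):
    grad = 2.0 * (x - b)
    x, m, v = projection_free_adaptive_step(x, grad, m, v, t)
```

The key cost saving in this sketch is that the oracle call is a normalization (O(d) work along one direction), whereas an exact projection onto a general convex set can require solving a high-order optimization problem at every iteration.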

Original language: English
Pages (from-to): 764-779
Number of pages: 16
Journal: Cognitive Computation
Volume: 14
Issue number: 2
DOIs
Publication status: Published - Mar 2022

Keywords

  • Adaptive training algorithm
  • Convex optimization
  • Online learning
  • Projection-free
