Joint Sparse Regularization for Dictionary Learning

Jianyu Miao, Heling Cao, Xiao Bo Jin, Rongrong Ma, Xuan Fei, Lingfeng Niu*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

9 Citations (Scopus)

Abstract

As a powerful data representation framework, dictionary learning has emerged in many domains, including machine learning, signal processing, and statistics. Most existing dictionary learning methods use the ℓ0 or ℓ1 norm as regularization to promote sparsity, which neglects redundant information in the dictionary. In this paper, a class of joint sparse regularizers is introduced into dictionary learning, leading to a compact dictionary. Unlike previous works, which obtain each sparse representation independently, we consider all representations in the dictionary simultaneously. An efficient iterative solver based on the ConCave-Convex Procedure (CCCP) framework and Lagrangian duality is developed to tackle the resulting model. Further, building on dictionary learning with joint sparse regularization, we consider a multi-layer structure, which can extract more abstract representations of the data. Numerical experiments are conducted on several publicly available datasets, and the results demonstrate the effectiveness of joint sparse regularization for dictionary learning.
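The joint sparse regularization described in the abstract can be illustrated with an ℓ2,1-type penalty that shrinks entire rows of the coefficient matrix at once, so unused dictionary atoms are pruned jointly rather than per sample. The sketch below is a minimal illustration using a proximal alternating scheme, not the paper's CCCP/Lagrangian-dual solver; all function names and parameters are our own assumptions.

```python
import numpy as np

def prox_l21(A, tau):
    """Proximal operator of tau * ||A||_{2,1}: row-wise group soft-thresholding.
    Whole rows of A shrink toward zero, giving joint sparsity across samples."""
    norms = np.linalg.norm(A, axis=1, keepdims=True)
    scale = np.maximum(1.0 - tau / np.maximum(norms, 1e-12), 0.0)
    return scale * A

def joint_sparse_dl(X, n_atoms=32, lam=0.1, n_iter=50, seed=0):
    """Illustrative alternating minimization for
        min_{D,A} 0.5 * ||X - D A||_F^2 + lam * ||A||_{2,1}
    (hypothetical sketch; the paper uses a CCCP-based solver instead)."""
    rng = np.random.default_rng(seed)
    d, n = X.shape
    D = rng.standard_normal((d, n_atoms))
    D /= np.linalg.norm(D, axis=0, keepdims=True)  # unit-norm atoms
    A = np.zeros((n_atoms, n))
    for _ in range(n_iter):
        # Code update: one proximal-gradient step on A with step size 1/L.
        L = np.linalg.norm(D, 2) ** 2  # Lipschitz constant of the smooth part
        grad = D.T @ (D @ A - X)
        A = prox_l21(A - grad / L, lam / L)
        # Dictionary update: least squares on atoms with nonzero code rows,
        # then renormalize atoms and rescale codes so D @ A is unchanged.
        active = np.linalg.norm(A, axis=1) > 0
        if active.any():
            D_new = X @ np.linalg.pinv(A[active])
            norms = np.maximum(np.linalg.norm(D_new, axis=0, keepdims=True), 1e-12)
            D[:, active] = D_new / norms
            A[active] *= norms.T
    return D, A
```

Because the ℓ2,1 penalty zeroes out whole coefficient rows, atoms whose rows vanish can be dropped, which is one way to read the "compact dictionary" claim above.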

Original language: English
Pages (from-to): 697-710
Number of pages: 14
Journal: Cognitive Computation
Volume: 11
Issue number: 5
DOIs
Publication status: Published - 1 Oct 2019

Keywords

  • Dictionary learning
  • Joint sparse regularization
  • Multi-layer structure