A Novel Deep Density Model for Unsupervised Learning

Xi Yang, Kaizhu Huang*, Rui Zhang, John Y. Goulermas

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

17 Citations (Scopus)


Density models are fundamental in machine learning and are widely applied to practical cognitive modeling tasks and learning problems. In this work, we introduce a novel deep density model, referred to as deep mixtures of factor analyzers with common loadings (DMCFA), together with an efficient greedy layer-wise unsupervised learning algorithm. The model employs a mixture of factor analyzers sharing common component loadings in each layer. The common loadings can be viewed as a feature selection or dimensionality reduction matrix, which makes the new model more physically meaningful. Importantly, sharing common components substantially reduces both the number of free parameters and the computational complexity. Consequently, inference and learning in DMCFA rely on a dramatically more succinct model, while flexibility in estimating the data density is retained by using Gaussian distributions as the priors. Our model is evaluated on five real datasets and compared to three competitive models, mixtures of factor analyzers (MFA), MFA with common loadings (MCFA), and deep mixtures of factor analyzers (DMFA), as well as their collapsed counterparts. The results demonstrate the superiority of the proposed model in the tasks of density estimation, clustering, and generation.
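The abstract's core argument is that sharing one loading matrix across all mixture components shrinks the parameter count relative to a standard MFA while still defining a valid mixture density. The sketch below illustrates this for a single MCFA-style layer, assuming the usual parameterization in which component c has latent mean xi_c and latent covariance Omega_c, all components share a loading matrix A and diagonal noise Psi, and the component marginal of an observation is N(A xi_c, A Omega_c A^T + Psi). All variable names and the dimensions are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)
p, q, C = 10, 3, 4  # observed dim, latent dim, number of mixture components

# Shared ("common") loading matrix A, used by every component.
A = rng.standard_normal((p, q))
# Per-component latent means xi_c and latent covariances Omega_c.
xi = rng.standard_normal((C, q))
Omega = np.stack([np.eye(q) for _ in range(C)])
# Shared diagonal noise covariance Psi and uniform mixing weights pi.
Psi = np.diag(rng.uniform(0.5, 1.0, p))
pi = np.full(C, 1.0 / C)

def mcfa_log_density(x):
    """Mixture log-density: each component marginal is a Gaussian
    N(A xi_c, A Omega_c A^T + Psi) in the observed space."""
    comp = [pi[c] * multivariate_normal.pdf(x, A @ xi[c],
                                            A @ Omega[c] @ A.T + Psi)
            for c in range(C)]
    return np.log(sum(comp))

x = rng.standard_normal(p)
print(mcfa_log_density(x))  # finite log-likelihood of one sample

# Rough free-parameter comparison (ignoring Psi and pi, which both
# variants share): MFA stores C separate p x q loadings plus p-dim
# means; MCFA stores one shared loading plus small latent parameters.
mfa_params = C * (p * q + p)
mcfa_params = p * q + C * (q + q * (q + 1) // 2)
print(mfa_params, mcfa_params)
```

With these toy dimensions the shared-loading variant needs far fewer free parameters (66 vs. 160), which is the saving the abstract attributes to the common loadings; in a deep model the gap compounds across layers.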

Original language: English
Pages (from-to): 778-788
Number of pages: 11
Journal: Cognitive Computation
Issue number: 6
Publication status: Published - 1 Dec 2019


Keywords
  • Common component factor loadings
  • Deep density model
  • Dimensionality reduction
  • Mixtures of factor analyzers
