TY - GEN
T1 - Deep mixtures of factor analyzers with common loadings
T2 - 24th International Conference on Neural Information Processing, ICONIP 2017
AU - Yang, Xi
AU - Huang, Kaizhu
AU - Zhang, Rui
N1 - Publisher Copyright:
© Springer International Publishing AG 2017.
PY - 2017
Y1 - 2017
N2 - In this paper, we propose a novel deep density model called Deep Mixtures of Factor Analyzers with Common Loadings (DMCFA). By employing a mixture of factor analyzers that share common component loadings, the model becomes more physically meaningful, since the common loadings can be regarded as feature selection or dimensionality reduction matrices. Importantly, DMCFA remarkably reduces the number of free parameters, making inference and learning dramatically easier. Despite its simplicity, DMCFA does not sacrifice flexibility in estimating the data density, because it engages learnable Gaussian distributions as the component priors. This is particularly evident in comparison with the existing Deep Mixtures of Factor Analyzers (DMFA) model, which exploits a different loading matrix for each component but assumes only a simple standard Gaussian prior. We evaluate the performance of the proposed DMCFA against three competitive models: DMFA and its shallow counterparts, Mixtures of Factor Analyzers (MFA) and Mixtures of Common Factor Analyzers (MCFA). Results on four real data sets show that the novel model achieves significantly better performance in both density estimation and clustering.
AB - In this paper, we propose a novel deep density model called Deep Mixtures of Factor Analyzers with Common Loadings (DMCFA). By employing a mixture of factor analyzers that share common component loadings, the model becomes more physically meaningful, since the common loadings can be regarded as feature selection or dimensionality reduction matrices. Importantly, DMCFA remarkably reduces the number of free parameters, making inference and learning dramatically easier. Despite its simplicity, DMCFA does not sacrifice flexibility in estimating the data density, because it engages learnable Gaussian distributions as the component priors. This is particularly evident in comparison with the existing Deep Mixtures of Factor Analyzers (DMFA) model, which exploits a different loading matrix for each component but assumes only a simple standard Gaussian prior. We evaluate the performance of the proposed DMCFA against three competitive models: DMFA and its shallow counterparts, Mixtures of Factor Analyzers (MFA) and Mixtures of Common Factor Analyzers (MCFA). Results on four real data sets show that the novel model achieves significantly better performance in both density estimation and clustering.
KW - Common component factor loadings
KW - Deep density model
KW - Dimensionality reduction
KW - Mixtures of factor analyzers
UR - http://www.scopus.com/inward/record.url?scp=85035079392&partnerID=8YFLogxK
U2 - 10.1007/978-3-319-70087-8_73
DO - 10.1007/978-3-319-70087-8_73
M3 - Conference Proceeding
AN - SCOPUS:85035079392
SN - 9783319700861
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 709
EP - 719
BT - Neural Information Processing - 24th International Conference, ICONIP 2017, Proceedings
A2 - Li, Yuanqing
A2 - Liu, Derong
A2 - Xie, Shengli
A2 - El-Alfy, El-Sayed M.
A2 - Zhao, Dongbin
PB - Springer Verlag
Y2 - 14 November 2017 through 18 November 2017
ER -