Maximum Gaussian mixture model for classification

  • Jiehao Zhang
  • Xianbin Hong
  • Sheng Uei Guan
  • Xuan Zhao
  • Xin Huang
  • Nian Xue
Research output: Chapter in Book/Report/Conference proceeding › Conference Proceeding › peer-review

14 Citations (Scopus)

Abstract

There are a variety of models and algorithms that solve classification problems. Among these, the Maximum Gaussian Mixture Model (MGMM), which we proposed earlier, describes data using the maximum value of a set of Gaussians. The Expectation Maximization (EM) algorithm can be used to fit this model. In this paper, we propose a multiEM approach to solve MGMM and to train MGMM-based classifiers. This approach combines multiple MGMMs, each solved by EM, into a single classifier. Classifiers trained with this approach achieved good performance under 10-fold cross-validation on both artificial and real-life datasets.
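The core idea of MGMM described above, scoring a point by the maximum component density rather than the usual weighted sum, can be sketched as a toy classifier. This is an illustrative sketch only, not the authors' multiEM implementation: the per-class component parameters (weights, means, standard deviations) are hypothetical stand-ins for values that EM would estimate from training data.

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Density of a 1-D Gaussian with mean mu and std sigma at x."""
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2.0 * math.pi))

def mgmm_score(x, components):
    """MGMM score: the MAXIMUM weighted component density,
    instead of the sum a standard GMM would use."""
    return max(w * gaussian_pdf(x, mu, s) for (w, mu, s) in components)

# Hypothetical per-class parameters (weight, mean, std); in the paper
# these would be estimated by running EM on each class's data.
class_models = {
    "A": [(0.5, 0.0, 1.0), (0.5, 2.0, 0.5)],
    "B": [(0.7, 5.0, 1.0), (0.3, 7.0, 1.5)],
}

def classify(x):
    """Assign x to the class whose MGMM gives the highest score,
    mirroring how multiple per-class MGMMs combine into a classifier."""
    return max(class_models, key=lambda c: mgmm_score(x, class_models[c]))

print(classify(0.3))  # near class A's components -> "A"
print(classify(5.5))  # near class B's components -> "B"
```

The only change from an ordinary mixture-based classifier is the `max` in `mgmm_score`; replacing it with `sum` would recover the standard GMM density.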

Original language: English
Title of host publication: Proceedings - 2016 8th International Conference on Information Technology in Medicine and Education, ITME 2016
Editors: Ying Dai, Shaozi Li, Yun Cheng
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 587-591
Number of pages: 5
ISBN (Electronic): 9781509039050
DOIs
Publication status: Published - 12 Jul 2017
Event: 8th International Conference on Information Technology in Medicine and Education, ITME 2016 - Fuzhou, China
Duration: 23 Dec 2016 – 25 Dec 2016

Publication series

Name: Proceedings - 2016 8th International Conference on Information Technology in Medicine and Education, ITME 2016

Conference

Conference: 8th International Conference on Information Technology in Medicine and Education, ITME 2016
Country/Territory: China
City: Fuzhou
Period: 23/12/16 – 25/12/16

Keywords

  • Classification
  • Maximum Gaussian mixture model
