Prototype learning with margin-based conditional log-likelihood loss

Xiaobo Jin*, Cheng Lin Liu, Xinwen Hou

*Corresponding author for this work

Research output: Chapter in Book or Report/Conference proceeding › Conference Proceeding › peer-review

5 Citations (Scopus)

Abstract

The classification performance of nearest-prototype classifiers relies largely on the prototype learning algorithm, such as learning vector quantization (LVQ) and minimum classification error (MCE). This paper proposes a new prototype learning algorithm based on the minimization of a conditional log-likelihood loss (CLL), called log-likelihood of margin (LOGM). A regularization term is added to avoid over-fitting in training. The CLL loss in LOGM is a convex function of the margin and thus gives better convergence than the MCE algorithm. Our empirical study on a large suite of benchmark datasets demonstrates that the proposed algorithm yields higher accuracies than MCE, generalized LVQ (GLVQ), and the soft nearest prototype classifier (SNPC).
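The margin-based CLL loss described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the GLVQ-style margin (distance to the nearest rival prototype minus distance to the nearest correct prototype), the scale parameter `xi`, and the L2 regularizer weight `lam` are assumptions made here for concreteness.

```python
import numpy as np

def logm_loss(x, y, prototypes, proto_labels, xi=1.0, lam=0.01):
    """Sketch of a LOGM-style margin-based conditional log-likelihood loss.

    The margin is taken as d_rival - d_correct, where d_correct is the
    squared distance from x to the nearest prototype of the true class y
    and d_rival is the squared distance to the nearest prototype of any
    other class. The loss -log sigmoid(xi * margin) = log(1 + exp(-xi * margin))
    is convex in the margin; lam weights an L2 regularizer on the
    prototypes to discourage over-fitting.
    """
    d = np.sum((prototypes - x) ** 2, axis=1)   # squared distance to each prototype
    same = proto_labels == y
    d_correct = d[same].min()                   # nearest prototype of the true class
    d_rival = d[~same].min()                    # nearest prototype of a rival class
    margin = d_rival - d_correct                # positive when x is classified correctly
    return np.log1p(np.exp(-xi * margin)) + lam * np.sum(prototypes ** 2)

# Toy example: two 2-D prototypes, one per class.
protos = np.array([[0.0, 0.0], [4.0, 0.0]])
labels = np.array([0, 1])
x = np.array([0.5, 0.0])
loss_good = logm_loss(x, 0, protos, labels, lam=0.0)  # correct label: small loss
loss_bad = logm_loss(x, 1, protos, labels, lam=0.0)   # wrong label: large loss
```

Because the loss is smooth and convex in the margin, the prototypes could be trained by plain gradient descent on its sum over the training set, which is the kind of optimization behavior the abstract contrasts with MCE.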

Original language: English
Title of host publication: 2008 19th International Conference on Pattern Recognition, ICPR 2008
Publication status: Published - 2008
Externally published: Yes
Event: 2008 19th International Conference on Pattern Recognition, ICPR 2008 - Tampa, FL, United States
Duration: 8 Dec 2008 - 11 Dec 2008

Publication series

Name: Proceedings - International Conference on Pattern Recognition
ISSN (Print): 1051-4651

Conference

Conference: 2008 19th International Conference on Pattern Recognition, ICPR 2008
Country/Territory: United States
City: Tampa, FL
Period: 8/12/08 - 11/12/08
