TY - JOUR
T1 - Learning Imbalanced Classifiers Locally and Globally with One-Side Probability Machine
AU - Huang, Kaizhu
AU - Zhang, Rui
AU - Yin, Xu-Cheng
N1 - Publisher Copyright:
© 2014, Springer Science+Business Media New York.
PY - 2015/6/1
Y1 - 2015/6/1
N2 - We consider the imbalanced learning problem, where the data associated with one class are far fewer than those associated with the other class. Current imbalanced learning methods often handle this problem by adapting certain intermediate parameters so as to impose a bias toward the minority data. However, most of these methods are not rigorous and must tune those factors via a trial-and-error procedure. Recently, a model called the Biased Minimax Probability Machine (BMPM) provided a rigorous and systematic treatment and has demonstrated very promising performance on imbalanced learning. Despite its success, BMPM relies exclusively on global information, namely, the first-order and second-order data statistics; such information might, however, be unreliable, especially for the minority data. In this paper, we propose a new model called the One-Side Probability Machine (OSPM). Different from previous approaches, OSPM leads to a rigorous treatment of biased classification tasks. Importantly, the proposed OSPM exploits the reliable global information from one side only, i.e., the majority class, while engaging robust local learning on the other side, i.e., the minority class. To the best of our knowledge, OSPM is the first model capable of learning data both locally and globally. Our proposed model also establishes close connections with several well-known models, such as BMPM, the Support Vector Machine, and the Maxi-Min Margin Machine. One appealing feature is that the optimization problem involved in the novel OSPM model can be cast as a convex second-order cone programming problem with the global optimum guaranteed. A series of experimental results on three data sets demonstrates the advantages of our proposed method over four competitive approaches.
AB - We consider the imbalanced learning problem, where the data associated with one class are far fewer than those associated with the other class. Current imbalanced learning methods often handle this problem by adapting certain intermediate parameters so as to impose a bias toward the minority data. However, most of these methods are not rigorous and must tune those factors via a trial-and-error procedure. Recently, a model called the Biased Minimax Probability Machine (BMPM) provided a rigorous and systematic treatment and has demonstrated very promising performance on imbalanced learning. Despite its success, BMPM relies exclusively on global information, namely, the first-order and second-order data statistics; such information might, however, be unreliable, especially for the minority data. In this paper, we propose a new model called the One-Side Probability Machine (OSPM). Different from previous approaches, OSPM leads to a rigorous treatment of biased classification tasks. Importantly, the proposed OSPM exploits the reliable global information from one side only, i.e., the majority class, while engaging robust local learning on the other side, i.e., the minority class. To the best of our knowledge, OSPM is the first model capable of learning data both locally and globally. Our proposed model also establishes close connections with several well-known models, such as BMPM, the Support Vector Machine, and the Maxi-Min Margin Machine. One appealing feature is that the optimization problem involved in the novel OSPM model can be cast as a convex second-order cone programming problem with the global optimum guaranteed. A series of experimental results on three data sets demonstrates the advantages of our proposed method over four competitive approaches.
KW - Classification
KW - Imbalanced learning
KW - Learning locally and globally
UR - http://www.scopus.com/inward/record.url?scp=84929061503&partnerID=8YFLogxK
U2 - 10.1007/s11063-014-9370-9
DO - 10.1007/s11063-014-9370-9
M3 - Article
AN - SCOPUS:84929061503
SN - 1370-4621
VL - 41
SP - 311
EP - 323
JO - Neural Processing Letters
JF - Neural Processing Letters
IS - 3
ER -