TY - GEN
T1 - One-side probability machine
T2 - 20th International Conference on Neural Information Processing, ICONIP 2013
AU - Zhang, Rui
AU - Huang, Kaizhu
PY - 2013
Y1 - 2013
N2 - Imbalanced learning is a challenging task in machine learning, where the data associated with one class are far fewer than those associated with the other class. In this paper, we propose a novel model called One-Side Probability Machine (OSPM) that is able to learn from imbalanced data rigorously and accurately. In particular, OSPM provides a rigorous treatment of biased or imbalanced classification tasks, which is significantly different from previous approaches. Importantly, the proposed OSPM exploits the reliable global information from one side only, i.e., the majority class, while engaging robust local learning [2] from the other side, i.e., the minority class. Such a setting proves much more effective than other models such as the Biased Minimax Probability Machine (BMPM). To the best of our knowledge, OSPM is the first model capable of learning from imbalanced data both locally and globally. The proposed model also has close connections with well-known models such as BMPM and the Support Vector Machine. One appealing feature is that the optimization problem involved can be cast as a convex second-order cone programming problem with a guaranteed global optimum. A series of experiments on three data sets demonstrates the advantages of our proposed method against four competitive approaches.
AB - Imbalanced learning is a challenging task in machine learning, where the data associated with one class are far fewer than those associated with the other class. In this paper, we propose a novel model called One-Side Probability Machine (OSPM) that is able to learn from imbalanced data rigorously and accurately. In particular, OSPM provides a rigorous treatment of biased or imbalanced classification tasks, which is significantly different from previous approaches. Importantly, the proposed OSPM exploits the reliable global information from one side only, i.e., the majority class, while engaging robust local learning [2] from the other side, i.e., the minority class. Such a setting proves much more effective than other models such as the Biased Minimax Probability Machine (BMPM). To the best of our knowledge, OSPM is the first model capable of learning from imbalanced data both locally and globally. The proposed model also has close connections with well-known models such as BMPM and the Support Vector Machine. One appealing feature is that the optimization problem involved can be cast as a convex second-order cone programming problem with a guaranteed global optimum. A series of experiments on three data sets demonstrates the advantages of our proposed method against four competitive approaches.
UR - http://www.scopus.com/inward/record.url?scp=84893415762&partnerID=8YFLogxK
U2 - 10.1007/978-3-642-42042-9_18
DO - 10.1007/978-3-642-42042-9_18
M3 - Conference Proceeding
AN - SCOPUS:84893415762
SN - 9783642420412
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 140
EP - 147
BT - Neural Information Processing - 20th International Conference, ICONIP 2013, Proceedings
Y2 - 3 November 2013 through 7 November 2013
ER -