TY - GEN
T1 - Image classification via support vector machine
AU - Sun, Xiaowu
AU - Liu, Lizhen
AU - Wang, Hanshi
AU - Song, Wei
AU - Lu, Jingli
N1 - Publisher Copyright:
© 2015 IEEE.
PY - 2016/6/13
Y1 - 2016/6/13
N2 - With the rapid growth of image data, image classification has become a central problem, and most researchers rely on neural networks to perform it. However, neural networks cannot escape their own limitations, including convergence to local optima and dependence on the input sample data. In this paper, another algorithm, the support vector machine, whose main idea is to build a hyperplane as the decision surface, is introduced to address these problems. In the theory part, to solve for the optimal hyperplane in the separable-patterns problem, the Lagrange-multiplier formulation is transformed into its dual problem. In the application section, which shows that the support vector machine handles the classification problem well, the eigenvalues of the images' gray-level information, extracted by Principal Component Analysis, are used as the input samples. The classification precision reaches 89.66%, far higher than the neural network's 41.38%.
AB - With the rapid growth of image data, image classification has become a central problem, and most researchers rely on neural networks to perform it. However, neural networks cannot escape their own limitations, including convergence to local optima and dependence on the input sample data. In this paper, another algorithm, the support vector machine, whose main idea is to build a hyperplane as the decision surface, is introduced to address these problems. In the theory part, to solve for the optimal hyperplane in the separable-patterns problem, the Lagrange-multiplier formulation is transformed into its dual problem. In the application section, which shows that the support vector machine handles the classification problem well, the eigenvalues of the images' gray-level information, extracted by Principal Component Analysis, are used as the input samples. The classification precision reaches 89.66%, far higher than the neural network's 41.38%.
KW - hyperplane
KW - image classification
KW - neural networks
KW - principal component analysis
KW - support vector machine
UR - https://www.scopus.com/pages/publications/84979257073
U2 - 10.1109/ICCSNT.2015.7490795
DO - 10.1109/ICCSNT.2015.7490795
M3 - Conference Proceeding
AN - SCOPUS:84979257073
T3 - Proceedings of 2015 4th International Conference on Computer Science and Network Technology, ICCSNT 2015
SP - 485
EP - 489
BT - Proceedings of 2015 4th International Conference on Computer Science and Network Technology, ICCSNT 2015
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 4th International Conference on Computer Science and Network Technology, ICCSNT 2015
Y2 - 19 December 2015 through 20 December 2015
ER -