TY - GEN
T1 - A novel hybrid approach for combining deep and traditional neural networks
AU - Zhang, Rui
AU - Zhang, Shufei
AU - Huang, Kaizhu
N1 - Publisher Copyright:
© Springer International Publishing Switzerland 2014.
PY - 2014
Y1 - 2014
N2 - Over the last fifty years, Neural Networks (NNs) have been important and active models in machine learning and pattern recognition. Among the different types of NNs, the Back Propagation (BP) NN is one popular model, widely exploited in various applications. Recently, NNs have attracted even more attention in the community because a deep learning structure (if appropriately adopted) can significantly improve learning performance. In this paper, based on a probabilistic assumption over the output neurons, we propose a hybrid strategy that combines one typical deep NN, i.e., the Convolutional NN (CNN), with the popular BP. We present the justification and describe the detailed learning formulations. A series of experiments validates that the hybrid approach can largely improve the accuracy of both CNN and BP on two large-scale benchmark data sets, i.e., MNIST and USPS. In particular, the proposed hybrid method significantly reduced the error rates of CNN and BP by 11.72% and 28.89%, respectively, on MNIST.
AB - Over the last fifty years, Neural Networks (NNs) have been important and active models in machine learning and pattern recognition. Among the different types of NNs, the Back Propagation (BP) NN is one popular model, widely exploited in various applications. Recently, NNs have attracted even more attention in the community because a deep learning structure (if appropriately adopted) can significantly improve learning performance. In this paper, based on a probabilistic assumption over the output neurons, we propose a hybrid strategy that combines one typical deep NN, i.e., the Convolutional NN (CNN), with the popular BP. We present the justification and describe the detailed learning formulations. A series of experiments validates that the hybrid approach can largely improve the accuracy of both CNN and BP on two large-scale benchmark data sets, i.e., MNIST and USPS. In particular, the proposed hybrid method significantly reduced the error rates of CNN and BP by 11.72% and 28.89%, respectively, on MNIST.
UR - http://www.scopus.com/inward/record.url?scp=84910007548&partnerID=8YFLogxK
U2 - 10.1007/978-3-319-12643-2_43
DO - 10.1007/978-3-319-12643-2_43
M3 - Conference Proceeding
AN - SCOPUS:84910007548
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 349
EP - 356
BT - Neural Information Processing - 21st International Conference, ICONIP 2014, Proceedings
A2 - Loo, Chu Kiong
A2 - Yap, Keem Siah
A2 - Wong, Kok Wai
A2 - Teoh, Andrew
A2 - Huang, Kaizhu
PB - Springer-Verlag
T2 - 21st International Conference on Neural Information Processing, ICONIP 2014
Y2 - 3 November 2014 through 6 November 2014
ER -