TY - GEN
T1 - Multi-modal Neural Network for Traffic Event Detection
AU - Chen, Qi
AU - Wang, Wei
N1 - Funding Information:
The research is funded by the Research Development Fund at Xi'an Jiaotong-Liverpool University, contract number RDF-16-01-34. We gratefully acknowledge the grant subsidy (XJTLU RIBDA2019-IRP1) provided by the Research Institute of Big Data Analytics (RIBDA), Xi'an Jiaotong-Liverpool University.
Publisher Copyright:
© 2019 IEEE.
PY - 2019/12
Y1 - 2019/12
N2 - Cities are composed of complex systems with Cyber, Physical, and Social (CPS) components. Advances in the Internet of Things (IoT) and social networking services help people understand the dynamics of cities. Traffic event detection is an important yet complex task in the transportation modeling and management of smart cities. In this paper, we address the task of detecting traffic events using two types of data, i.e., physical sensor observations and social media text. Unlike most existing studies, which focus on either sensor observations or social media data alone, we identify traffic events using both types of data, which may complement each other. We propose a Multi-modal Neural Network (MMN) to process sensor observations and social media texts simultaneously and detect traffic events. We evaluate our model on a real-world CPS dataset consisting of sensor observations, event reports, and tweets collected from Twitter about San Francisco over a period of four months. The evaluation shows promising results and provides insights into the analysis of multi-modal data for detecting traffic events.
AB - Cities are composed of complex systems with Cyber, Physical, and Social (CPS) components. Advances in the Internet of Things (IoT) and social networking services help people understand the dynamics of cities. Traffic event detection is an important yet complex task in the transportation modeling and management of smart cities. In this paper, we address the task of detecting traffic events using two types of data, i.e., physical sensor observations and social media text. Unlike most existing studies, which focus on either sensor observations or social media data alone, we identify traffic events using both types of data, which may complement each other. We propose a Multi-modal Neural Network (MMN) to process sensor observations and social media texts simultaneously and detect traffic events. We evaluate our model on a real-world CPS dataset consisting of sensor observations, event reports, and tweets collected from Twitter about San Francisco over a period of four months. The evaluation shows promising results and provides insights into the analysis of multi-modal data for detecting traffic events.
KW - LSTM
KW - deep learning
KW - multi-modal network
KW - recurrent neural network
KW - traffic event detection
UR - http://www.scopus.com/inward/record.url?scp=85084079673&partnerID=8YFLogxK
U2 - 10.1109/ICECE48499.2019.9058508
DO - 10.1109/ICECE48499.2019.9058508
M3 - Conference Proceeding
AN - SCOPUS:85084079673
T3 - 2019 IEEE 2nd International Conference on Electronics and Communication Engineering, ICECE 2019
SP - 26
EP - 30
BT - 2019 IEEE 2nd International Conference on Electronics and Communication Engineering, ICECE 2019
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2nd IEEE International Conference on Electronics and Communication Engineering, ICECE 2019
Y2 - 9 December 2019 through 11 December 2019
ER -