TY - JOUR
T1 - Intelligent Cruise Guidance and Vehicle Resource Management with Deep Reinforcement Learning
AU - Sun, Guolin
AU - Liu, Kai
AU - Boateng, Gordon Owusu
AU - Liu, Guisong
AU - Jiang, Wei
N1 - Publisher Copyright:
© 2014 IEEE.
PY - 2022/3/1
Y1 - 2022/3/1
N2 - The emergence of new business and technological models for urban transportation has driven the growth of transportation network companies (TNCs). Most research on TNCs optimizes the interests of drivers, passengers, and the platform operator under the assumption that vehicle resources remain fixed, ignoring the optimization of resource utilization and satisfaction from the perspective of flexible, controllable vehicle resources. In practice, the system load varies over time, which necessitates flexible control of resources. Drivers wish to utilize their vehicles effectively to maximize profits, passengers want to minimize waiting time, and the platform cares about the commission it can accrue from successful trips. In this article, we propose an adaptive intelligent cruise guidance and vehicle resource management model that balances vehicle resource utilization and request success rate while improving platform revenue. We propose an advanced deep reinforcement learning (DRL) method that autonomously learns system states and guides vehicles to hotspot areas where they can pick up orders. We assume the number of online vehicles is flexible, so the learning agent can autonomously adjust it according to the real-time load to improve effective vehicle resource utilization. An adaptive reward mechanism controls the relative importance of vehicle resource utilization and request success rate at each decision step. Simulation results and analysis reveal that, compared with baseline algorithms, our proposed DRL-based scheme balances vehicle resource utilization and request success rate at acceptable levels while improving platform revenue.
AB - The emergence of new business and technological models for urban transportation has driven the growth of transportation network companies (TNCs). Most research on TNCs optimizes the interests of drivers, passengers, and the platform operator under the assumption that vehicle resources remain fixed, ignoring the optimization of resource utilization and satisfaction from the perspective of flexible, controllable vehicle resources. In practice, the system load varies over time, which necessitates flexible control of resources. Drivers wish to utilize their vehicles effectively to maximize profits, passengers want to minimize waiting time, and the platform cares about the commission it can accrue from successful trips. In this article, we propose an adaptive intelligent cruise guidance and vehicle resource management model that balances vehicle resource utilization and request success rate while improving platform revenue. We propose an advanced deep reinforcement learning (DRL) method that autonomously learns system states and guides vehicles to hotspot areas where they can pick up orders. We assume the number of online vehicles is flexible, so the learning agent can autonomously adjust it according to the real-time load to improve effective vehicle resource utilization. An adaptive reward mechanism controls the relative importance of vehicle resource utilization and request success rate at each decision step. Simulation results and analysis reveal that, compared with baseline algorithms, our proposed DRL-based scheme balances vehicle resource utilization and request success rate at acceptable levels while improving platform revenue.
KW - Cruise guidance
KW - deep reinforcement learning (DRL)
KW - resource management
KW - transportation network companies (TNCs)
UR - http://www.scopus.com/inward/record.url?scp=85111028465&partnerID=8YFLogxK
U2 - 10.1109/JIOT.2021.3098779
DO - 10.1109/JIOT.2021.3098779
M3 - Article
AN - SCOPUS:85111028465
SN - 2327-4662
VL - 9
SP - 3574
EP - 3585
JO - IEEE Internet of Things Journal
JF - IEEE Internet of Things Journal
IS - 5
ER -