From Perception to Path Planning: A Modular End-to-End Collaborative Architecture for AVPC

Gordon Owusu Boateng, Xinhao Liu, Xiansheng Guo, Zhao Wang, Azzam Mourad, Mohsen Guizani

Research output: Contribution to journal › Article

Abstract

The rapid adoption of Autonomous Electric Vehicles (AEVs) and the global expansion of EV-charging infrastructure are driving demand for intelligent Automated Valet Parking and Charging (AVPC) systems. Effective AVPC requires accurate environmental perception to support efficient scheduling of parking resources and safe navigation of AEVs in complex, dynamic parking environments. However, existing approaches often rely solely on static infrastructure-side sensing, which limits perception accuracy and results in suboptimal scheduling and path planning. This article proposes a collaborative vehicle-infrastructure perception architecture for End-to-End (E2E) parking resource scheduling and AEV path planning in AVPC systems. The architecture comprises three key modules: a perception module that fuses multi-sensor data from AEV-end and infrastructure-end sensors to detect parking resource occupancy status and classify resources by type; a scheduling module that leverages the fused perception results to optimize the assignment of parking resources to AEVs; and a path planning module that computes safe, collision-free routes for AEVs to their designated parking resources. The collaborative architecture enhances situational awareness, improves resource utilization efficiency, and provides a scalable foundation for real-time AVPC deployment. Simulation results demonstrate the efficacy of the proposed architecture in terms of perception accuracy and collision-free path planning in a typical AVPC setting.
Original language: English
Journal: IEEE Communications Magazine
Publication status: In preparation - 2025
