TY - GEN
T1 - Achelous: A Fast Unified Water-Surface Panoptic Perception Framework Based on Fusion of Monocular Camera and 4D mmWave Radar
T2 - 26th IEEE International Conference on Intelligent Transportation Systems, ITSC 2023
AU - Guan, Runwei
AU - Yao, Shanliang
AU - Zhu, Xiaohui
AU - Man, Ka Lok
AU - Lim, Eng Gee
AU - Smith, Jeremy S.
AU - Yue, Yong
AU - Yue, Yutao
N1 - Publisher Copyright:
© 2023 IEEE.
PY - 2023/7/14
Y1 - 2023/7/14
N2 - Current perception models for different tasks usually exist in modular forms on Unmanned Surface Vehicles (USVs); running them in parallel on edge devices yields extremely slow inference, causing asynchrony between perception results and the USV's position and leading to erroneous decisions in autonomous navigation. Compared with Unmanned Ground Vehicles (UGVs), robust perception for USVs has developed relatively slowly. Moreover, most current multi-task perception models have huge parameter counts, infer slowly and are not scalable. Motivated by this, we propose Achelous, a low-cost and fast unified panoptic perception framework for water-surface perception based on the fusion of a monocular camera and a 4D mmWave radar. Achelous can simultaneously perform five tasks: detection and segmentation of visual targets, drivable-area segmentation, waterline segmentation and radar point cloud segmentation. Besides, models in the Achelous family, with fewer than about 5 million parameters, achieve about 18 FPS on an NVIDIA Jetson AGX Xavier, 11 FPS faster than HybridNets, and exceed YOLOX-Tiny and SegFormer-B0 on our collected dataset by about 5 mAP50-95 and 0.7 mIoU, especially under adverse weather, dark environments and camera failure. To our knowledge, Achelous is the first comprehensive panoptic perception framework combining vision-level and point-cloud-level tasks for water-surface perception. To promote the development of the intelligent transportation community, we release our code at https://github.com/GuanRunwei/Achelous.
AB - Current perception models for different tasks usually exist in modular forms on Unmanned Surface Vehicles (USVs); running them in parallel on edge devices yields extremely slow inference, causing asynchrony between perception results and the USV's position and leading to erroneous decisions in autonomous navigation. Compared with Unmanned Ground Vehicles (UGVs), robust perception for USVs has developed relatively slowly. Moreover, most current multi-task perception models have huge parameter counts, infer slowly and are not scalable. Motivated by this, we propose Achelous, a low-cost and fast unified panoptic perception framework for water-surface perception based on the fusion of a monocular camera and a 4D mmWave radar. Achelous can simultaneously perform five tasks: detection and segmentation of visual targets, drivable-area segmentation, waterline segmentation and radar point cloud segmentation. Besides, models in the Achelous family, with fewer than about 5 million parameters, achieve about 18 FPS on an NVIDIA Jetson AGX Xavier, 11 FPS faster than HybridNets, and exceed YOLOX-Tiny and SegFormer-B0 on our collected dataset by about 5 mAP50-95 and 0.7 mIoU, especially under adverse weather, dark environments and camera failure. To our knowledge, Achelous is the first comprehensive panoptic perception framework combining vision-level and point-cloud-level tasks for water-surface perception. To promote the development of the intelligent transportation community, we release our code at https://github.com/GuanRunwei/Achelous.
KW - Fusion of vision and radar
KW - Unified panoptic perception
KW - USV-based water-surface perception
UR - http://www.scopus.com/inward/record.url?scp=85186502342&partnerID=8YFLogxK
U2 - 10.1109/ITSC57777.2023.10422325
DO - 10.1109/ITSC57777.2023.10422325
M3 - Conference Proceeding
AN - SCOPUS:85186502342
T3 - IEEE Conference on Intelligent Transportation Systems, Proceedings, ITSC
SP - 182
EP - 188
BT - 2023 IEEE 26th International Conference on Intelligent Transportation Systems, ITSC 2023
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 24 September 2023 through 28 September 2023
ER -