TY - JOUR
T1 - WaterScenes
T2 - A Multi-Task 4D Radar-Camera Fusion Dataset and Benchmarks for Autonomous Driving on Water Surfaces
AU - Yao, Shanliang
AU - Guan, Runwei
AU - Wu, Zhaodong
AU - Ni, Yi
AU - Huang, Zile
AU - Liu, Wen
AU - Yue, Yong
AU - Ding, Weiping
AU - Lim, Eng Gee
AU - Seo, Hyungjoon
AU - Man, Ka Lok
AU - Ma, Jieming
AU - Zhu, Xiaohui
AU - Yue, Yutao
N1 - Publisher Copyright:
IEEE
PY - 2024/6/26
Y1 - 2024/6/26
N2 - Autonomous driving on water surfaces plays an essential role in executing hazardous and time-consuming missions, such as maritime surveillance, survivor rescue, environmental monitoring, hydrographic mapping, and waste cleaning. This work presents WaterScenes, the first multi-task 4D radar-camera fusion dataset for autonomous driving on water surfaces. Equipped with a 4D radar and a monocular camera, our Unmanned Surface Vehicle (USV) provides an all-weather solution for discerning object-related information, including color, shape, texture, range, velocity, azimuth, and elevation. Focusing on typical static and dynamic objects on water surfaces, we label the camera images and radar point clouds at the pixel level and point level, respectively. In addition to basic perception tasks such as object detection, instance segmentation, and semantic segmentation, we also provide annotations for free-space segmentation and waterline segmentation. Leveraging the multi-task and multi-modal data, we conduct benchmark experiments on the radar and camera uni-modalities as well as on the fused modalities. Experimental results demonstrate that 4D radar-camera fusion considerably improves the accuracy and robustness of perception on water surfaces, especially under adverse lighting and weather conditions. The WaterScenes dataset is publicly available at https://waterscenes.github.io.
AB - Autonomous driving on water surfaces plays an essential role in executing hazardous and time-consuming missions, such as maritime surveillance, survivor rescue, environmental monitoring, hydrographic mapping, and waste cleaning. This work presents WaterScenes, the first multi-task 4D radar-camera fusion dataset for autonomous driving on water surfaces. Equipped with a 4D radar and a monocular camera, our Unmanned Surface Vehicle (USV) provides an all-weather solution for discerning object-related information, including color, shape, texture, range, velocity, azimuth, and elevation. Focusing on typical static and dynamic objects on water surfaces, we label the camera images and radar point clouds at the pixel level and point level, respectively. In addition to basic perception tasks such as object detection, instance segmentation, and semantic segmentation, we also provide annotations for free-space segmentation and waterline segmentation. Leveraging the multi-task and multi-modal data, we conduct benchmark experiments on the radar and camera uni-modalities as well as on the fused modalities. Experimental results demonstrate that 4D radar-camera fusion considerably improves the accuracy and robustness of perception on water surfaces, especially under adverse lighting and weather conditions. The WaterScenes dataset is publicly available at https://waterscenes.github.io.
KW - 4D radar-camera fusion
KW - Autonomous driving
KW - Autonomous vehicles
KW - Cameras
KW - Meteorology
KW - multi-task
KW - Multitasking
KW - Radar
KW - Radar imaging
KW - Task analysis
KW - unmanned surface vehicle
UR - http://www.scopus.com/inward/record.url?scp=85197611307&partnerID=8YFLogxK
U2 - 10.1109/TITS.2024.3415772
DO - 10.1109/TITS.2024.3415772
M3 - Article
AN - SCOPUS:85197611307
SN - 1524-9050
SP - 1
EP - 15
JO - IEEE Transactions on Intelligent Transportation Systems
JF - IEEE Transactions on Intelligent Transportation Systems
ER -