TY - JOUR
T1 - Towards Deep Radar Perception for Autonomous Driving
T2 - Datasets, Methods, and Challenges
AU - Zhou, Yi
AU - Liu, Lulu
AU - Zhao, Haocheng
AU - López-Benítez, Miguel
AU - Yu, Limin
AU - Yue, Yutao
N1 - Funding Information:
Funding: This research was partially funded by the XJTLU-JITRI Academy of Industrial Technology, the Institute of Deep Perception Technology (IDPT), and the Research Enhancement Fund of XJTLU (REF-19-01-04).
Publisher Copyright:
© 2022 by the authors. Licensee MDPI, Basel, Switzerland.
PY - 2022/6/1
Y1 - 2022/6/1
N2 - With recent developments, the performance of automotive radar has improved significantly. The next generation of 4D radar can achieve imaging capability in the form of high-resolution point clouds. In this context, we believe that the era of deep learning for radar perception has arrived. However, studies on radar deep learning are spread across different tasks, and a holistic overview is lacking. This review paper attempts to provide a big picture of the deep radar perception stack, including signal processing, datasets, labelling, data augmentation, and downstream tasks such as depth and velocity estimation, object detection, and sensor fusion. For these tasks, we focus on explaining how the network structure is adapted to radar domain knowledge. In particular, we summarise three overlooked challenges in deep radar perception, including multi-path effects, uncertainty problems, and adverse weather effects, and present some attempts to solve them.
KW - automotive radars
KW - autonomous driving
KW - deep learning
KW - multi-sensor fusion
KW - object detection
KW - radar signal processing
UR - http://www.scopus.com/inward/record.url?scp=85131703088&partnerID=8YFLogxK
U2 - 10.3390/s22114208
DO - 10.3390/s22114208
M3 - Review article
C2 - 35684831
AN - SCOPUS:85131703088
SN - 1424-8220
VL - 22
JO - Sensors
JF - Sensors
IS - 11
M1 - 4208
ER -