External multi-modal imaging sensor calibration for sensor fusion: A review

Zhouyan Qiu*, Joaquín Martínez-Sánchez, Pedro Arias, Rabia Rashdi

*Corresponding author for this work

Research output: Contribution to journal › Review article › peer-review

19 Citations (Scopus)

Abstract

Multi-modal data fusion has gained popularity due to its diverse applications, leading to an increased demand for external sensor calibration. Although several calibration solutions have proven effective, none fully satisfies all the evaluation criteria, including accuracy, automation, and robustness. This review therefore aims to contribute to this growing field by examining recent research on multi-modal imaging sensor calibration and proposing future research directions. The literature review comprehensively explains the characteristics and conditions of different multi-modal external calibration methods, including traditional motion-based calibration and feature-based calibration. Target-based and targetless calibration, the two types of feature-based calibration, are discussed in detail. Furthermore, the paper highlights systematic calibration as an emerging research direction. Finally, this review identifies the crucial factors for evaluating calibration methods and provides a comprehensive discussion of their applications, with the aim of offering valuable insights to guide future research. Future research should focus primarily on online targetless calibration and systematic multi-modal sensor calibration.
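As context for the target-based methods surveyed in the review, the sketch below illustrates the core step many of them share: recovering the rigid transform (the extrinsic parameters) between two sensor frames from matched 3D points on a calibration target. This is a minimal, illustrative example using the SVD-based Kabsch solution under the assumption that point correspondences are already available; the function name and the synthetic data are hypothetical and do not come from the reviewed work.

```python
import numpy as np

def estimate_rigid_transform(src_pts, dst_pts):
    """Estimate R, t such that dst ≈ R @ src + t from matched 3D points
    (SVD-based Kabsch solution); src_pts and dst_pts are (N, 3) arrays."""
    src_mean = src_pts.mean(axis=0)
    dst_mean = dst_pts.mean(axis=0)
    # Cross-covariance of the centred correspondences
    H = (src_pts - src_mean).T @ (dst_pts - dst_mean)
    U, _, Vt = np.linalg.svd(H)
    # Reflection guard keeps det(R) = +1, i.e. a proper rotation
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_mean - R @ src_mean
    return R, t

# Synthetic example: target corner points seen in a "LiDAR" frame and the
# same points expressed in a "camera" frame (hypothetical ground truth).
rng = np.random.default_rng(0)
lidar_pts = rng.uniform(-1.0, 1.0, size=(12, 3))
angle = np.deg2rad(30.0)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([0.2, -0.1, 0.5])
camera_pts = lidar_pts @ R_true.T + t_true

R_est, t_est = estimate_rigid_transform(lidar_pts, camera_pts)
print(np.allclose(R_est, R_true), np.allclose(t_est, t_true))  # True True
```

In practice, the surveyed target-based methods differ mainly in how they extract and match such points (e.g., checkerboard corners or board planes detected in images and point clouds), while targetless and motion-based methods replace the explicit target correspondences with natural features or per-sensor ego-motion.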
Original language: English
Journal: Information Fusion
Volume: 97
DOIs
Publication status: Published - 1 Sept 2023
Externally published: Yes

Keywords

  • Multi-modal
  • Sensor calibration
  • LiDAR
  • Camera
  • Sensor fusion
  • Mobile mapping
