Advances in multimodal data fusion in neuroimaging: Overview, challenges, and novel orientation

Yu Dong Zhang*, Zhengchao Dong, Shui Hua Wang, Xiang Yu, Xujing Yao, Qinghua Zhou, Hua Hu, Min Li, Carmen Jiménez-Mesa, Javier Ramirez, Francisco J. Martinez, Juan Manuel Gorriz

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

274 Citations (Scopus)

Abstract

Multimodal fusion in neuroimaging combines data from multiple imaging modalities to overcome the fundamental limitations of individual modalities. Neuroimaging fusion can achieve higher temporal and spatial resolution, enhance contrast, correct imaging distortions, and bridge physiological and cognitive information. In this study, we analyzed over 450 references from PubMed, Google Scholar, IEEE, ScienceDirect, Web of Science, and other sources published from 1978 to 2020. We provide a review that encompasses (1) an overview of current challenges in multimodal fusion, (2) the current medical applications of fusion for specific neurological diseases, (3) strengths and limitations of available imaging modalities, (4) fundamental fusion rules, (5) fusion quality assessment methods, and (6) the applications of fusion for atlas-based segmentation and quantification. Overall, multimodal fusion shows significant benefits in clinical diagnosis and neuroscience research. Widespread education and further research among engineers, researchers, and clinicians will benefit the field of multimodal neuroimaging.

Original language: English
Pages (from-to): 149-187
Number of pages: 39
Journal: Information Fusion
Volume: 64
DOIs
Publication status: Published - Dec 2020
Externally published: Yes

Keywords

  • Applications
  • Assessment
  • Fusion rules
  • Magnetic resonance imaging
  • Multimodal data fusion
  • Neuroimaging
  • PET
  • Partial volume effect
  • SPECT
