Improved Stereo Matching Algorithm Based on Sparse Window

Qi Tang, Yuanping Xu*, Jiliu Zhou, Chao Kong, Jin Jin, Zhijie Xu, Chaolong Zhang, Yajing Shi

*Corresponding author for this work

Research output: Chapter in Book or Report/Conference proceeding › Conference Proceeding › peer-review

Abstract

This study presents a sparse window-based stereo matching algorithm that improves the accuracy and efficiency of the semi-global matching algorithm. Unlike traditional methods, the algorithm encodes pixel regions according to their texture features, which makes the encoding more efficient. The proposed approach systematically samples pixels within the original encoding window to reduce the number of pixels involved in the computation. Additionally, the FAST feature detection method is used to distinguish texture areas, and a different encoding process is applied to each area to obtain the feature encoding of the center pixel. Experimental results show that, compared with the traditional semi-global stereo matching algorithm, the proposed sparse window-based algorithm reduces processing time by 0.06 seconds and reduces the average error by 10.92%.
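The abstract only outlines the method, so the sketch below is a minimal, non-authoritative illustration of the two ideas it names, written in Python with OpenCV and NumPy: a census transform whose comparison window is systematically sampled (the 7×9 window and stride of 2 are placeholder choices, not the paper's), and a FAST-based split into textured and textureless areas (the abstract does not specify how the FAST output selects between encoding processes, so the dilated corner mask below is an assumption).

```python
import cv2
import numpy as np

def sparse_census(img, win_h=7, win_w=9, stride=2):
    """Census transform over a systematically sampled window.

    Rather than comparing the centre pixel with every neighbour in the
    win_h x win_w window, only every `stride`-th neighbour is used, which
    shortens the bit string and reduces per-pixel work (the core idea the
    abstract describes; the exact window and stride are assumptions).
    """
    h, w = img.shape
    rh, rw = win_h // 2, win_w // 2
    padded = cv2.copyMakeBorder(img, rh, rh, rw, rw, cv2.BORDER_REFLECT)
    bits = []
    for dy in range(-rh, rh + 1, stride):
        for dx in range(-rw, rw + 1, stride):
            if dy == 0 and dx == 0:
                continue  # skip the centre pixel itself
            neighbour = padded[rh + dy:rh + dy + h, rw + dx:rw + dx + w]
            bits.append((neighbour < img).astype(np.uint8))
    return np.stack(bits, axis=-1)  # shape (h, w, n_sampled_bits)

def hamming_cost(census_left, census_right, d):
    """Matching cost at disparity d: Hamming distance of the bit strings.
    (np.roll wraps at the image border; a full implementation would mask
    the first d columns.)"""
    shifted = np.roll(census_right, d, axis=1)
    return np.count_nonzero(census_left != shifted, axis=-1)

def texture_mask(img, dilate_px=5):
    """Rough textured/textureless partition from FAST corners. Per the
    abstract, each area would get its own encoding process, e.g. a dense
    census window in textured areas and the sparse one elsewhere."""
    fast = cv2.FastFeatureDetector_create(threshold=20)
    mask = np.zeros(img.shape, np.uint8)
    for kp in fast.detect(img, None):
        x, y = map(int, kp.pt)
        mask[y, x] = 255
    return cv2.dilate(mask, np.ones((dilate_px, dilate_px), np.uint8)) > 0

# Hypothetical usage: per-pixel matching costs at disparity 8.
# left  = cv2.imread("left.png",  cv2.IMREAD_GRAYSCALE)
# right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)
# cost = hamming_cost(sparse_census(left), sparse_census(right), d=8)
```

In a full pipeline these Hamming costs would feed the standard semi-global aggregation along scanlines; only the cost-volume construction differs from the classic census/SGM approach.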

Original language: English
Title of host publication: ICAC 2023 - 28th International Conference on Automation and Computing
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9798350335859
DOIs
Publication status: Published - 2023
Externally published: Yes
Event: 28th International Conference on Automation and Computing, ICAC 2023 - Birmingham, United Kingdom
Duration: 30 Aug 2023 - 1 Sept 2023

Publication series

Name: ICAC 2023 - 28th International Conference on Automation and Computing

Conference

Conference: 28th International Conference on Automation and Computing, ICAC 2023
Country/Territory: United Kingdom
City: Birmingham
Period: 30/08/23 - 1/09/23

Keywords

  • census transform
  • matching costs
  • semi-global matching
  • stereo vision
  • systematic sampling
