TY - JOUR
T1 - MVG-Splatting
T2 - Multi-view guided Gaussian Splatting with adaptive quantile-based geometric consistency densification
AU - Li, Zhuoxiao
AU - Yao, Shanliang
AU - Chu, Yijie
AU - García-Fernández, Ángel F.
AU - Yue, Yong
AU - Ding, Weiping
AU - Zhu, Xiaohui
N1 - Publisher Copyright:
© 2025
PY - 2026/2
Y1 - 2026/2
N2 - In the rapidly evolving field of image-fusion-based 3D reconstruction, 3D Gaussian Splatting (3DGS) and 2D Gaussian Splatting (2DGS) represent significant advancements. Although 2DGS compresses 3D Gaussian primitives into 2D Gaussian surfels to effectively enhance mesh extraction quality, this compression can lead to a decrease in rendering quality. Additionally, unreliable densification processes and depth computed by accumulating opacity can compromise the detail of the extracted mesh. To address these issues, we integrate an optimized method for calculating normals, which, combined with image gradients, helps rectify inconsistencies in the original depth computations. Moreover, utilizing projection strategies akin to those in Multi-View Stereo (MVS), we propose an adaptive quantile-based method that dynamically determines the level of additional densification guided by depth maps, from coarse to fine detail. Furthermore, we design a joint loss function that combines edge-aware and feature-aware depth constraints to ensure that our refined depth aligns well with ground-truth image edges and features, thereby improving both photometric and geometric consistency. Experiments demonstrate that our method not only resolves the rendering-quality degradation caused by depth discrepancies but also enables direct mesh extraction from denser Gaussian point clouds using the Marching Cubes algorithm. This approach significantly enhances the fidelity and accuracy of the 3D reconstruction process, preserving both geometric detail and visual quality. The project is available at https://mvgsplatting.github.io/.
AB - In the rapidly evolving field of image-fusion-based 3D reconstruction, 3D Gaussian Splatting (3DGS) and 2D Gaussian Splatting (2DGS) represent significant advancements. Although 2DGS compresses 3D Gaussian primitives into 2D Gaussian surfels to effectively enhance mesh extraction quality, this compression can lead to a decrease in rendering quality. Additionally, unreliable densification processes and depth computed by accumulating opacity can compromise the detail of the extracted mesh. To address these issues, we integrate an optimized method for calculating normals, which, combined with image gradients, helps rectify inconsistencies in the original depth computations. Moreover, utilizing projection strategies akin to those in Multi-View Stereo (MVS), we propose an adaptive quantile-based method that dynamically determines the level of additional densification guided by depth maps, from coarse to fine detail. Furthermore, we design a joint loss function that combines edge-aware and feature-aware depth constraints to ensure that our refined depth aligns well with ground-truth image edges and features, thereby improving both photometric and geometric consistency. Experiments demonstrate that our method not only resolves the rendering-quality degradation caused by depth discrepancies but also enables direct mesh extraction from denser Gaussian point clouds using the Marching Cubes algorithm. This approach significantly enhances the fidelity and accuracy of the 3D reconstruction process, preserving both geometric detail and visual quality. The project is available at https://mvgsplatting.github.io/.
KW - 3D Gaussian Splatting
KW - 3D reconstruction
KW - Multi-view image fusion
UR - https://www.scopus.com/pages/publications/105011500268
U2 - 10.1016/j.inffus.2025.103540
DO - 10.1016/j.inffus.2025.103540
M3 - Article
AN - SCOPUS:105011500268
SN - 1566-2535
VL - 126
JO - Information Fusion
JF - Information Fusion
M1 - 103540
ER -