Abstract
In many computational problems, using Markov Chain Monte Carlo (MCMC) methods can be prohibitively time-consuming. We propose MCMC-Net, a simple yet efficient way to accelerate MCMC via neural networks. The key idea of our approach is to substitute the true likelihood function of the MCMC method with a neural operator-based surrogate. We extensively evaluate the accuracy and speedup of our method on three partial differential equation-based inverse problems in which likelihood computations are computationally expensive, namely electrical impedance tomography, diffuse optical tomography, and quantitative photoacoustic tomography. MCMC-Net performs similarly to its classical likelihood counterpart but with a significant speedup. We conjecture that the method can be applied to any problem with a sufficiently expensive likelihood function. We also analyze MCMC-Net theoretically for the different use cases. We prove a universal approximation theorem-type result showing that the proposed network can approximate the mapping defined by forward model evaluations to a desired accuracy. Furthermore, we establish convergence of the surrogate posterior to the true posterior in the Hellinger distance.
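The core idea of the abstract (replacing the expensive likelihood evaluation inside an MCMC loop with a cheap learned surrogate) can be illustrated with a minimal sketch. This is not the authors' implementation: the surrogate below is a stand-in callable (a trained neural operator would take its place), and the random-walk Metropolis sampler, prior, and all names are illustrative assumptions.

```python
import numpy as np

# Hypothetical surrogate: any callable mapping a parameter vector to an
# approximate log-likelihood (e.g. a trained neural operator). A quadratic
# stand-in is used here purely so the sketch runs end to end.
def surrogate_log_likelihood(theta):
    return -0.5 * np.sum(theta**2)

def log_prior(theta):
    # Standard Gaussian prior, assumed for illustration only.
    return -0.5 * np.sum(theta**2)

def metropolis_hastings(log_like, log_prior, theta0, n_steps=10000, step=0.1, seed=None):
    """Random-walk Metropolis where the expensive likelihood has been
    swapped for a cheap surrogate, in the spirit of MCMC-Net."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    log_post = log_like(theta) + log_prior(theta)
    samples = []
    for _ in range(n_steps):
        proposal = theta + step * rng.standard_normal(theta.shape)
        log_post_prop = log_like(proposal) + log_prior(proposal)
        # Accept with probability min(1, exp(log_post_prop - log_post)).
        if np.log(rng.uniform()) < log_post_prop - log_post:
            theta, log_post = proposal, log_post_prop
        samples.append(theta.copy())
    return np.array(samples)

if __name__ == "__main__":
    chain = metropolis_hastings(surrogate_log_likelihood, log_prior,
                                theta0=np.zeros(4), n_steps=5000, seed=0)
    print("posterior mean estimate:", chain[1000:].mean(axis=0))
```

Because each step now calls the surrogate rather than a PDE forward solve, the per-iteration cost drops to a single network evaluation, which is the source of the speedup claimed in the abstract.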
| Original language | English |
|---|---|
| Article number | 095013 |
| Journal | Inverse Problems |
| Volume | 41 |
| Issue number | 9 |
| DOIs | |
| Publication status | Published - 30 Sept 2025 |
Keywords
- Bayesian inverse problems
- convolutional neural network
- deep learning
- Markov Chain Monte Carlo