GNGTS 2023 - Atti del 41° Convegno Nazionale
Session 3.3

To efficiently sample the posterior distribution, we introduce a sampling algorithm in which the proposal distribution is constructed from the local gradient and the Hessian of the negative log posterior. For non-linear problems, Bayesian inversion is often solved through Markov chain Monte Carlo (MCMC) sampling. Monte Carlo is a technique for randomly sampling a probability distribution; a Markov chain is a systematic method for generating a sequence of random variables in which the current value depends probabilistically only on the previous state of the chain. Combining the two allows random sampling of high-dimensional probability distributions that honors the probabilistic dependence between samples, by constructing a Markov chain that comprises the Monte Carlo samples. We call our algorithm gradient-based Markov chain Monte Carlo (GB-MCMC). The GB-MCMC elastic FWI method can quantify inversion uncertainties through the estimated posterior distributions, given sufficiently long Markov chains. MCMC sampling methods provide a global view of the model space, so the inversion avoids entrapment in a local region. In theory, the GB-MCMC method can accurately estimate the posterior distribution, given sufficiently long Markov chains, from arbitrary starting points. However, expensive forward modeling operators and high-dimensional parameter spaces make the application of MCMC algorithms computationally unfeasible. A suitable strategy to reduce the computational complexity of this type of inverse problem is to compress the model space through appropriate reparameterization techniques, in order to reduce the number of data points and model parameters and hence the dimensions of the Hessian H and the gradient g.
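The proposal mechanism described above can be sketched in a few lines. The toy 2-D Gaussian target below stands in for the far more expensive elastic FWI posterior, and all function and variable names are illustrative assumptions, not the authors' implementation; the proposal is the local Gaussian approximation of the target built from the gradient g and the Hessian H of the negative log posterior.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy negative log posterior: a correlated 2-D Gaussian standing in for
# the elastic FWI misfit (illustrative assumption, not the authors' code).
C = np.array([[2.0, 0.8], [0.8, 1.0]])   # posterior covariance
Cinv = np.linalg.inv(C)
mu = np.array([1.0, -2.0])               # posterior mean

def neg_log_post(m):
    d = m - mu
    return 0.5 * d @ Cinv @ d

def gradient(m):
    return Cinv @ (m - mu)

def hessian(m):
    return Cinv  # constant for a Gaussian target; local in general

def gb_mcmc(m0, n_iter=5000, step=0.9):
    """Metropolis-Hastings where the proposal is the local Gaussian
    approximation of the target: m' ~ N(m - step*H^-1 g, step*H^-1)."""
    def log_q(x, mean, H):
        d = x - mean
        P = H / step                      # proposal precision matrix
        return 0.5 * np.log(np.linalg.det(P)) - 0.5 * d @ P @ d

    m = np.asarray(m0, dtype=float)
    chain = []
    for _ in range(n_iter):
        H = hessian(m)
        Hinv = np.linalg.inv(H)
        fwd_mean = m - step * Hinv @ gradient(m)
        prop = fwd_mean + np.linalg.cholesky(step * Hinv) @ rng.standard_normal(m.size)
        # Hastings correction: the proposal is state-dependent, hence asymmetric.
        Hp = hessian(prop)
        bwd_mean = prop - step * np.linalg.inv(Hp) @ gradient(prop)
        log_alpha = (neg_log_post(m) - neg_log_post(prop)
                     + log_q(m, bwd_mean, Hp) - log_q(prop, fwd_mean, H))
        if np.log(rng.random()) < log_alpha:
            m = prop
        chain.append(m.copy())
    return np.array(chain)

chain = gb_mcmc([5.0, 5.0])
posterior = chain[1000:]                  # discard the burn-in period
print(posterior.mean(axis=0))             # close to mu
```

Because the proposal already points towards high-probability regions, acceptance stays high and the chain converges far faster than with a blind random-walk proposal.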
In this work, we propose a GB-MCMC elastic FWI method combined with compression of the data and model spaces through the discrete cosine transform (DCT), and we apply this strategy to a 2D synthetic model with one strong vertical velocity variation and some lateral ones.

METHOD

In this work we employ a gradient-based MCMC sampling algorithm combined with a DCT to reduce the model and data spaces. Further details can be found in Zhao and Sen (2021) or in Aleardi et al. (2021), who applied the method to probabilistically solve electrical resistivity tomography. With this sampling algorithm, the proposal is constructed from the local gradient and the Hessian of the negative log posterior. Using this information speeds up the convergence of the probabilistic sampling, because the proposed model is drawn from a local approximation of the target posterior probability density (PPD). The ensemble of states sampled after the burn-in period (the first iterations, corresponding to the beginning of the chains where the algorithm moves towards a promising portion of the search space, which are discarded from the computation of the PPD) is used to numerically compute the statistical properties (e.g., mean and standard deviation) of the target PPD.
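The DCT reparameterization can be illustrated on a toy 2-D velocity model; the model geometry, values, and the number of retained coefficients below are illustrative assumptions, chosen only to mimic a model with a strong vertical contrast and mild lateral variations.

```python
import numpy as np
from scipy.fft import dctn, idctn

# Hypothetical 2-D velocity model (m/s): smooth vertical gradient,
# one sharp vertical contrast, and a gentle lateral variation.
nz, nx = 60, 100
z = np.linspace(0.0, 1.0, nz)[:, None]
model = 1500.0 + 1500.0 * z + 300.0 * (z > 0.5)   # column profile
model = np.repeat(model, nx, axis=1)
model += 100.0 * np.sin(np.linspace(0, 3 * np.pi, nx))[None, :]

# Orthonormal 2-D DCT of the model.
coeff = dctn(model, norm="ortho")

# Keep only the low-order (upper-left) coefficients: the model space
# shrinks from nz*nx parameters to kz*kx DCT coefficients.
kz, kx = 10, 10
compressed = np.zeros_like(coeff)
compressed[:kz, :kx] = coeff[:kz, :kx]

# The inverse DCT reconstructs a close approximation of the model.
recon = idctn(compressed, norm="ortho")
rel_err = np.linalg.norm(recon - model) / np.linalg.norm(model)
print(f"{nz * nx} -> {kz * kx} parameters, relative error {rel_err:.3f}")
```

Because the sampler then explores only the kz*kx retained coefficients instead of the full nz*nx grid, the gradient and Hessian shrink accordingly, which is exactly what makes the MCMC sampling tractable.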