New MCMC samplers and application to Bayesian calibration
This talk discusses Monte Carlo methods with applications to Bayesian calibration of physical models.
We take the view that (except for low-dimensional problems) Bayesian uncertainty quantification consists of producing samples from the posterior distribution of the model parameters given the data. Improved sampling methods (better MCMC) make new applications possible. We discuss some MCMC methods motivated by methods for nonlinear optimization. Both optimization and MCMC may be thought of as exploring a function.
We describe some affine invariant MCMC methods. These are motivated in part by the observed robustness of Newton and Gauss-Newton optimization methods, which are affine invariant. One of these methods is used in the EMCEE package of Foreman-Mackey. Another is a Monte Carlo version of the Gauss-Newton method. As in optimization, this would not work without a step-size reduction strategy. We describe such a strategy that satisfies detailed balance. We also describe a method for sampling distributions on surfaces via tangential moves and projections. A “reverse check” is necessary to account for the likely event that some projections fail.
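To make the affine invariance idea concrete, here is a minimal sketch of the ensemble "stretch move" of Goodman and Weare, the proposal underlying EMCEE. The function name, the in-place sequential sweep, and the default stretch parameter `a=2.0` are illustrative choices, not EMCEE's actual API:

```python
import numpy as np

def stretch_move_sweep(walkers, log_prob, a=2.0, rng=None):
    """One sweep of the affine-invariant stretch move over an ensemble.

    walkers  : (L, d) array of L walkers in d dimensions, updated in place
    log_prob : callable mapping a point in R^d to its log posterior density
    a        : stretch parameter; z is drawn from g(z) ~ 1/sqrt(z) on [1/a, a]
    """
    rng = np.random.default_rng() if rng is None else rng
    L, d = walkers.shape
    for k in range(L):
        # Pick a complementary walker j != k from the current ensemble.
        j = rng.integers(L - 1)
        if j >= k:
            j += 1
        # Sample the stretch factor z with density proportional to 1/sqrt(z).
        z = (1.0 + (a - 1.0) * rng.random()) ** 2 / a
        # Propose a move along the line through walkers k and j; because the
        # proposal is built only from differences of walker positions, the
        # method is invariant under affine changes of coordinates.
        proposal = walkers[j] + z * (walkers[k] - walkers[j])
        # The z^(d-1) factor in the acceptance ratio enforces detailed balance.
        log_accept = (d - 1) * np.log(z) + log_prob(proposal) - log_prob(walkers[k])
        if np.log(rng.random()) < log_accept:
            walkers[k] = proposal
    return walkers
```

A typical use would initialize the walkers in a small ball near a guess and run many sweeps, discarding an initial burn-in portion before treating the walker positions as posterior samples.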
Jonathan Goodman got his PhD in math from Stanford and has been at the Courant Institute of New York University since then. His early work was on theoretical fluid mechanics and shock wave theory. Since then he has worked on financial mathematics (helping start the math finance program at NYU) and Monte Carlo methods.