17. Overview of Part III: Sampling
“Any one who considers arithmetical methods of producing random digits is, of course, in a state of sin.”
—John von Neumann (1951)
We have already seen examples of sampling from PDFs in previous chapters. Here we look in depth at Markov chain Monte Carlo (MCMC), which is the workhorse of sampling methods. We will give an overview of both the theory and the practice, considering first the Random Walk Metropolis-Hastings algorithm and then other, more efficient samplers.
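As a preview of what the following chapters cover in detail, here is a minimal sketch of Random Walk Metropolis sampling (a special case of Metropolis-Hastings with a symmetric Gaussian proposal); the function and parameter names are illustrative, not taken from any particular library:

```python
import math
import random

def metropolis_sample(log_prob, x0, n_samples, step=1.0, seed=0):
    """Random Walk Metropolis: propose x' ~ N(x, step^2),
    accept with probability min(1, p(x')/p(x))."""
    rng = random.Random(seed)
    x = x0
    lp = log_prob(x)
    samples = []
    for _ in range(n_samples):
        x_new = x + rng.gauss(0.0, step)
        lp_new = log_prob(x_new)
        # Work in log space: accept if log(u) < log p(x') - log p(x)
        if math.log(rng.random()) < lp_new - lp:
            x, lp = x_new, lp_new
        samples.append(x)  # repeated states count; that is part of the chain
    return samples

# Target: standard normal, specified only up to normalization
samples = metropolis_sample(lambda x: -0.5 * x * x, x0=0.0, n_samples=20000)
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
```

Note that only an *unnormalized* log density is needed, since the normalization constant cancels in the acceptance ratio; this is the key property that makes MCMC practical.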
Chapters in this part:
Intuition for MCMC gives a general motivation for MCMC sampling, builds intuition through visualizations and analogies to statistical mechanics, introduces the Metropolis-Hastings algorithm, and provides some basic examples, such as an application to Poisson processes.
Details of MCMC provides formal and detailed discussion on stochastic processes in general, Markov chains, and Metropolis-Hastings MCMC.
Markov Chain Monte Carlo in practice looks at MCMC in practice, with convergence tests and other diagnostics.
Advanced sampling algorithms introduces a handful of useful, more advanced sampling algorithms such as Hamiltonian Monte Carlo (HMC), ensemble and slice sampling, parallel tempering, importance resampling, and nested sampling.
State-of-the-art MCMC implementations provides several demo notebooks for selected state-of-the-art implementations of advanced sampling algorithms.