Series: Monte Carlo Methods

Published by Weisser Zwerg Blog

A series about Monte Carlo methods and generating samples from probability distributions.

This is a series about Monte Carlo methods and sampling from statistical models. For a long time I avoided “stepping down” into the “cellar” of Bayesian statistics and tried to stay with high-level tools like Stan or PyMC3, until I realized that (too) often I needed a finer-grained approach.

The purpose of this blog post series is to give you a working understanding of fine-grained composable abstractions (FCA) that you can use to build solutions adapted to your own problems. Along the way you might also develop a deeper understanding of statistical modelling. At least, I benefited a lot from implementing the whole process end-to-end myself.

In principle there are only two core building blocks for Monte Carlo simulation (a minimal code sketch of both follows the list below):

  • Importance Sampling and its variations / synonyms like Sequential Importance Sampling (SIS), Sequential Monte Carlo (SMC), Particle Filters, …
  • Markov chain Monte Carlo (MCMC) and its variations like Metropolis-Hastings (MH), Hamiltonian (or Hybrid) Monte Carlo (HMC, NUTS), Gibbs sampling, …
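
To make these two building blocks concrete, here is a minimal sketch in Python (NumPy only). The unnormalized toy target, the Gaussian proposal scale, and the random-walk step size are illustrative assumptions of mine, not something defined in this series; both estimators target the same expectation E[x²] so you can compare them directly.

```python
import numpy as np

rng = np.random.default_rng(42)

def log_target(x):
    """Unnormalized log-density of a toy target: exp(-x^2/2) * (1 + sin(3x)^2)."""
    return -0.5 * x**2 + np.log1p(np.sin(3.0 * x) ** 2)

# Building block 1: self-normalized importance sampling.
# Draw from a wide Gaussian proposal q = N(0, 2^2) and reweight by p/q.
def importance_sampling_estimate(n_samples=50_000):
    scale = 2.0
    xs = rng.normal(0.0, scale, size=n_samples)
    log_q = -0.5 * (xs / scale) ** 2 - np.log(scale * np.sqrt(2.0 * np.pi))
    log_w = log_target(xs) - log_q
    w = np.exp(log_w - log_w.max())   # subtract max for numerical stability
    w /= w.sum()                      # normalizing cancels the unknown constant of p
    return np.sum(w * xs**2)          # estimate of E_p[x^2]

# Building block 2: random-walk Metropolis-Hastings (the simplest MCMC kernel).
def metropolis_hastings_chain(n_steps=50_000, step=1.0):
    x = 0.0
    chain = np.empty(n_steps)
    for i in range(n_steps):
        proposal = x + step * rng.normal()
        # Symmetric proposal: accept with probability min(1, p(proposal) / p(x)).
        if np.log(rng.uniform()) < log_target(proposal) - log_target(x):
            x = proposal
        chain[i] = x
    return chain

if __name__ == "__main__":
    chain = metropolis_hastings_chain()
    print("IS estimate of E[x^2]:", importance_sampling_estimate())
    print("MH estimate of E[x^2]:", np.mean(chain[5_000:] ** 2))  # drop burn-in
```

The variations listed above elaborate exactly these two skeletons: SIS/SMC and particle filters add sequential reweighting and resampling on top of the importance-sampling idea, while HMC/NUTS and Gibbs sampling replace the random-walk proposal with better-informed moves.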

Blog Posts

Further Reading

For a general overview of the Bayesian approach to statistics I recommend the following books:

For a textbook with a clear and comprehensive presentation of Monte Carlo / simulation methods, I’d suggest starting with:

The following two books focus more specifically on Importance Sampling / Sequential Monte Carlo and on Markov chain Monte Carlo, respectively:

I also regularly encounter pointers to the following book, but have not read it myself yet:

I put the publishing year in front of the references above because this is a fast-moving field: while the underlying core principles remain the same, the state of the art keeps evolving.

Feedback

Have you written a response to this? Let me know the URL via telegraph.
