Each week, we will be publishing layman’s abstracts of new articles from our prestigious portfolio of journals in statistics. The aim is to highlight the latest research to a broader audience in an accessible format.
Grenon‐Godbout, N. and Bédard, M. (2020), On‐line partitioning of the sample space in the regional adaptive algorithm. Can J Statistics. doi:10.1002/cjs.11562
Markov chain Monte Carlo (MCMC) methods are a class of algorithms for sampling observations from complex probability distributions. They are used in many applied settings ranging from genetics to climate modeling, and their properties have been heavily studied. Notably, theoretical results about the optimal tuning of these samplers have been obtained for specific probability distributions. This has led to the development of adaptive MCMC algorithms that learn, in some sense, their optimal tuning on the fly by dynamically updating their parameters during the sampling process.
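To make the idea of on-the-fly tuning concrete, here is a minimal sketch of an adaptive random-walk Metropolis sampler in Python. It is not the paper's algorithm: it simply nudges the proposal scale toward a target acceptance rate with a diminishing step size, a standard illustration of the adaptive MCMC principle. The function name and the target acceptance rate of 0.44 (a common one-dimensional guideline) are assumptions for the example.

```python
import numpy as np

def adaptive_metropolis(log_target, x0, n_steps, target_accept=0.44, seed=0):
    """Random-walk Metropolis whose proposal scale adapts toward a
    target acceptance rate (a simplified stochastic-approximation scheme)."""
    rng = np.random.default_rng(seed)
    x = float(x0)
    log_scale = 0.0                      # log of the proposal std. deviation
    samples = np.empty(n_steps)
    for i in range(n_steps):
        prop = x + np.exp(log_scale) * rng.standard_normal()
        accepted = np.log(rng.uniform()) < log_target(prop) - log_target(x)
        if accepted:
            x = prop
        # Diminishing adaptation: the update shrinks over time so the
        # chain's asymptotic behaviour is not disturbed.
        log_scale += (float(accepted) - target_accept) / np.sqrt(i + 1)
        samples[i] = x
    return samples

# Usage: sample from a standard normal target (log density up to a constant).
draws = adaptive_metropolis(lambda v: -0.5 * v * v, x0=5.0, n_steps=5000)
```

The diminishing step size `1 / sqrt(i + 1)` is what keeps the adaptation "fading", which is the usual route to preserving the sampler's theoretical guarantees.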
More recent developments have introduced regional adaptive samplers, with the aim of improving efficiency on particularly challenging probability distributions. Their name derives from the fact that they use the spatial locations of recently sampled points to decide their next course of action. Two well-known examples of such algorithms are RAPT and RAPTOR. The first relies on a fixed initial partition of the sample space, making it vulnerable to a sub-optimal choice of that partition. The second features an adaptive update of the partition, making it much more robust to that initial choice, but also more computationally demanding.
The contribution of the paper is a compromise between these two approaches. The idea is to retain the adaptive partitioning principle of RAPTOR, but to keep it as fast as possible by restricting the shape of the frontiers to hyperplanes. This yields an algorithm that is only marginally slower than RAPT, yet much more robust to the choice of initial partition. The new scheme is proven to preserve the desired theoretical properties of the algorithm. Extensive performance comparisons with RAPT and RAPTOR are reported. A real data example is also included, in which the new algorithm was able to reproduce the results of the other two with much less prior optimization.
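The flavour of a hyperplane-partitioned regional sampler can be sketched as follows. This is an illustrative toy, not the authors' algorithm: two regions separated by a hyperplane w·x + b = 0 each use their own proposal scale, and the frontier is periodically moved to the perpendicular bisector of the two regional sample means. The function name, the fixed per-region scales, and the bisector update rule are all assumptions made for the example.

```python
import numpy as np

def regional_rwm(log_target, x0, n_steps, scales=(0.5, 5.0), seed=1):
    """Toy two-region random-walk Metropolis: a hyperplane w.x + b >= 0
    splits the space, and the region of the current state selects the
    proposal scale. The frontier itself is updated on-line, but only
    within the family of hyperplanes (a linear, hence cheap, update)."""
    rng = np.random.default_rng(seed)
    d = len(x0)
    w, b = np.ones(d), 0.0               # initial hyperplane (assumed)
    x = np.asarray(x0, dtype=float)
    means = [np.zeros(d), np.zeros(d)]   # running mean of states per region
    counts = [1e-9, 1e-9]
    samples = np.empty((n_steps, d))
    for i in range(n_steps):
        r = int(w @ x + b >= 0.0)        # region of the current state
        prop = x + scales[r] * rng.standard_normal(d)
        if np.log(rng.uniform()) < log_target(prop) - log_target(x):
            x = prop
        counts[r] += 1.0
        means[r] += (x - means[r]) / counts[r]
        # Every 100 steps, move the frontier to the perpendicular
        # bisector of the two regional means -- still a hyperplane.
        if i % 100 == 99:
            diff = means[1] - means[0]
            if np.linalg.norm(diff) > 1e-8:
                w = diff
                b = -w @ (means[0] + means[1]) / 2.0
        samples[i] = x
    return samples
```

Because the frontier stays within the family of hyperplanes, each partition update costs only a vector subtraction and a dot product, which is the kind of saving that keeps such a sampler close to RAPT's speed while retaining an adaptive partition.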