Bayesian Sampling using Stochastic Gradient Thermostats
Venue
Advances in Neural Information Processing Systems (2014), pp. 3203-3211
Publication Year
2014
Authors
Nan Ding, Youhan Fang, Ryan Babbush, Changyou Chen, Robert Skeel, Hartmut Neven
Abstract
Dynamics-based sampling methods, such as Hybrid Monte Carlo (HMC) and Langevin
dynamics (LD), are commonly used to sample target distributions. Recently, such
approaches have been combined with stochastic gradient techniques to increase
sampling efficiency when dealing with large datasets. An outstanding problem with
this approach is that the stochastic gradient introduces an unknown amount of noise
which can prevent proper sampling after discretization. To remedy this problem, we
show that one can leverage a small number of additional variables to stabilize
momentum fluctuations induced by the unknown noise. Our method is inspired by the
idea of a thermostat in statistical physics and is justified by a general theory.
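
The thermostat idea the abstract describes can be sketched as a stochastic gradient Nosé-Hoover-style update: an auxiliary friction variable adapts until momentum fluctuations match the target temperature, absorbing the unknown gradient noise. The sketch below samples a 1D standard normal target; the variable names (`xi`, `eps`, `A`), step size, and injected noise scale are illustrative assumptions, not the paper's experimental settings.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_grad_U(theta):
    # Gradient of U(theta) = theta^2 / 2 (standard normal target),
    # corrupted by noise of unknown scale to mimic a minibatch gradient.
    return theta + rng.normal(0.0, 2.0)

def sgnht_sample(n_steps=200_000, eps=0.01, A=1.0):
    # theta: position, p: momentum, xi: thermostat (adaptive friction).
    theta, p, xi = 0.0, 0.0, A
    samples = []
    for _ in range(n_steps):
        # Momentum step: noisy gradient, adaptive friction, injected noise.
        p += (-eps * noisy_grad_U(theta) - eps * xi * p
              + np.sqrt(2.0 * A * eps) * rng.normal())
        theta += eps * p
        # Thermostat step: drive E[p^2] toward 1 (the target temperature),
        # so xi grows to cancel whatever noise the gradient injects.
        xi += eps * (p * p - 1.0)
        samples.append(theta)
    # Discard the first half as burn-in.
    return np.array(samples[n_steps // 2:])

samples = sgnht_sample()
print(round(samples.mean(), 2), round(samples.var(), 2))
```

Despite the gradient noise of unknown magnitude, the thermostat keeps the empirical mean near 0 and the variance near 1; a plain Langevin update with fixed friction would overheat under the same noise.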
