Adaptively Optimised Adaptive Importance Samplers
Preprint (submitted for publication), 2023
Recommended citation: Perello, C. A. C. C., Akyildiz, Ö. D. (2023). Adaptively Optimised Adaptive Importance Samplers. https://arxiv.org/abs/2307.09341
We introduce a new class of adaptive importance samplers leveraging adaptive optimisation tools, which we term AdaOAIS. We build on Optimised Adaptive Importance Samplers (OAIS), a class of techniques that adapt proposals to improve the mean-squared error of the importance sampling estimators by parameterising the proposal and optimising the $\chi^2$-divergence between the target and the proposal. We show that a naive implementation of OAIS using stochastic gradient descent may lead to unstable estimators despite its convergence guarantees. To remedy this shortcoming, we instead propose to use adaptive optimisers (such as AdaGrad and Adam) to improve the stability of OAIS. We provide convergence results for AdaOAIS in a similar manner to OAIS. We also provide empirical demonstrations on a variety of examples and show that AdaOAIS leads to stable importance sampling estimators in practice.
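To make the idea concrete, here is a minimal, hypothetical sketch (not the authors' code) of the OAIS recipe with an Adam update: a Gaussian proposal $q_\theta = \mathcal{N}(\theta, \sigma^2)$ is adapted towards a target by descending a Monte Carlo estimate of the gradient of $\rho(\theta) = \mathbb{E}_{q_\theta}[w(x)^2]$, where $w = \pi/q_\theta$ is the importance weight. The target, proposal family, and all hyperparameters below are illustrative assumptions.

```python
import numpy as np

def log_target(x):
    # Illustrative target: N(3, 1), known up to a normalising constant.
    return -0.5 * (x - 3.0) ** 2

def adaoais_adam(theta=0.0, sigma=2.0, n_iters=500, n_samples=1000,
                 lr=0.05, beta1=0.9, beta2=0.999, eps=1e-8, seed=0):
    """Sketch of chi^2-divergence minimisation with an Adam-style update.

    Uses the identity grad_theta E_q[w^2] = -E_q[w^2 * grad_theta log q_theta],
    estimated with samples drawn from the current proposal.
    """
    rng = np.random.default_rng(seed)
    m = v = 0.0
    for t in range(1, n_iters + 1):
        x = theta + sigma * rng.standard_normal(n_samples)      # x ~ q_theta
        log_q = -0.5 * ((x - theta) / sigma) ** 2 - np.log(sigma)
        w = np.exp(log_target(x) - log_q)                       # unnormalised weights
        score = (x - theta) / sigma ** 2                        # grad_theta log q_theta(x)
        g = -np.mean(w ** 2 * score)                            # estimate of grad rho(theta)
        # Adam update with bias correction.
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g ** 2
        m_hat = m / (1 - beta1 ** t)
        v_hat = v / (1 - beta2 ** t)
        theta -= lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta

theta_final = adaoais_adam()  # drifts from 0 towards the target mean 3
```

The adaptive step is what matters here: the squared weights make the raw gradients heavy-tailed, and the per-coordinate normalisation in Adam or AdaGrad damps those spikes, which is the stability argument the abstract makes against plain SGD.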
Here are some slides I made when presenting this as my BSc project.
Fun fact: One of the figures generated (but not shown) in the paper is this website’s favicon!