
Department of Biostatistics

Semi-Implicit Variational Inference

Mingyuan Zhou, Ph.D., Assistant Professor, Department of Statistics, University of Texas at Austin

March 15, 2018
3:30 PM - 5:00 PM
1690 SPH I
1415 Washington Heights
Ann Arbor, MI 48109-2029
Sponsored by: Department of Biostatistics
Contact Information: Zhenke Wu (zhenkewu@umich.edu)

Variational inference (VI) is an optimization-based method for approximate Bayesian inference. Compared with Markov chain Monte Carlo (MCMC), which is widely used in biomedical research, VI is often faster, offers simpler convergence diagnostics, and scales better to big data. However, it often considerably underestimates the posterior variance, making it inappropriate for applications that require accurate estimates of posterior uncertainty. In this talk, I will introduce semi-implicit VI (SIVI), which expands the commonly used family of analytic variational distributions by mixing the variational parameter with a flexible distribution. This mixing distribution can assume any density function, explicit or not, as long as independent random samples can be generated via reparameterization. Not only does SIVI expand the variational family to incorporate highly flexible variational distributions, including implicit ones with no analytic density function, such as those generated by propagating random noise through deep neural networks, but it also sandwiches the evidence lower bound (ELBO) between a lower bound and an upper bound, and further derives an asymptotically exact surrogate ELBO that is amenable to optimization via stochastic gradient ascent. With a substantially expanded variational family and a novel optimization algorithm, SIVI is shown to closely match the accuracy of MCMC for posterior inference in a variety of Bayesian inference tasks that are highly relevant to biomedical research.

Light refreshments for seminar guests will be served at 3:10 p.m. in 1690 SPH I.
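To make the semi-implicit construction concrete, the following is a minimal NumPy sketch of the hierarchy described above, not the speaker's implementation: the network architecture, sizes, and noise scale are all illustrative assumptions. Noise is pushed through a small neural network to produce the variational parameter psi (whose marginal density has no analytic form), an explicit Gaussian conditional layer then draws z given psi, and the intractable marginal log q(z) is approximated by averaging conditional densities over extra psi samples.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical weights for a tiny two-layer tanh network; in practice these
# would be learned variational parameters (sizes here are illustrative).
W1, b1 = rng.normal(size=(8, 2)), np.zeros(8)
W2, b2 = rng.normal(size=(2, 8)), np.zeros(2)

def sample_psi(n):
    """Implicit mixing distribution: push Gaussian noise through a neural
    network. The density of psi has no analytic form, but independent
    samples are cheap to draw via this reparameterization."""
    eps = rng.normal(size=(n, 2))
    h = np.tanh(eps @ W1.T + b1)
    return h @ W2.T + b2          # psi plays the role of a variational mean

def sample_z(n, sigma=0.1):
    """Semi-implicit sample: z | psi ~ N(psi, sigma^2 I). The conditional
    layer is explicit and reparameterizable even though the marginal q(z),
    integrated over psi, has no closed form."""
    psi = sample_psi(n)
    return psi + sigma * rng.normal(size=psi.shape)

def approx_log_q(z, K=100, sigma=0.1):
    """Monte Carlo estimate of log q(z) = log E_psi[ N(z; psi, sigma^2 I) ],
    averaging K conditional densities -- the kind of K-sample mixture that
    the surrogate bounds mentioned in the abstract are built from."""
    psi = sample_psi(K)                                              # (K, d)
    d = z.shape[-1]
    sq = np.sum((z[None, :, :] - psi[:, None, :]) ** 2, axis=-1)     # (K, n)
    log_n = -0.5 * d * np.log(2 * np.pi * sigma ** 2) - sq / (2 * sigma ** 2)
    m = log_n.max(axis=0)                                            # logsumexp
    return m + np.log(np.mean(np.exp(log_n - m), axis=0))

z = sample_z(1000)
log_q = approx_log_q(z)
print(z.shape, log_q.shape)   # (1000, 2) (1000,)
```

The key point of the sketch is the division of labor: the mixing layer may be arbitrarily flexible because only samples of psi are ever needed, while the conditional layer stays explicit so that densities can be evaluated and gradients reparameterized.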