Analytical Approximations for Bayesian Inference
2015 (English). Doctoral thesis, comprehensive summary (Other academic)
Bayesian inference is a statistical inference technique in which Bayes’ theorem is used to update the probability distribution of a random variable using observations. Except for a few simple cases, expressing such probability distributions in compact analytical form is infeasible. Approximation methods are required to express the a priori knowledge about a random variable in the form of prior distributions. Further approximations are needed to compute posterior distributions of the random variables using the observations. When the computational complexity of representing such posteriors grows over time, as in mixture models, approximations are required to reduce the complexity of the representations.
This thesis extends existing approximation methods for Bayesian inference and generalizes them in three aspects, namely: prior selection, posterior evaluation given the observations, and maintenance of computational complexity.
In particular, the maximum entropy properties of the first-order stable spline kernel for identification of linear time-invariant, stable, and causal systems are shown. Analytical approximations are used to express the prior knowledge about the properties of the impulse response of a linear time-invariant, stable, and causal system.
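As an illustrative sketch (not code from the thesis), the first-order stable spline kernel, also known as the tuned/correlated (TC) kernel, is K(i, j) = c·α^max(i, j) with 0 < α < 1; its exponentially decaying diagonal encodes the stability of the impulse-response prior. The function and parameter names below are illustrative.

```python
import numpy as np

def first_order_stable_spline_kernel(n, alpha, c=1.0):
    """Gram matrix of the first-order stable spline (TC) kernel,
    K(i, j) = c * alpha**max(i, j), 0 < alpha < 1, over lags 1..n."""
    idx = np.arange(1, n + 1)
    return c * alpha ** np.maximum.outer(idx, idx)

K = first_order_stable_spline_kernel(5, alpha=0.6)
# K is symmetric positive semidefinite; its diagonal decays as c * alpha**i,
# expressing the prior belief that the impulse response decays exponentially.
```

Used as the covariance of a Gaussian process prior over impulse-response coefficients, this kernel enforces both smoothness and exponential (BIBO-stable) decay.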
The variational Bayes (VB) method is used to compute an approximate posterior in two inference problems. In the first problem, an approximate posterior for the state smoothing problem for linear state-space models with unknown and time-varying noise covariances is proposed. In the second problem, the VB method is used for approximate inference in state-space models with skewed measurement noise.
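The VB smoothers in the thesis are considerably more involved; as a minimal sketch of the underlying mean-field coordinate-ascent idea, here is the textbook normal-gamma example (scalar Gaussian data with unknown mean and precision), where the joint posterior is approximated by a factorized q(mu)q(tau) and the two factors are updated in turn. All names and prior values below are illustrative.

```python
import numpy as np

def vb_normal_gamma(y, mu0=0.0, lam0=1e-6, a0=1e-6, b0=1e-6, iters=50):
    """Mean-field VB for y_i ~ N(mu, 1/tau) with a normal-gamma prior:
    q(mu, tau) is factorized as q(mu) q(tau) and updated by coordinate ascent."""
    y = np.asarray(y, dtype=float)
    n, ybar = len(y), np.mean(y)
    e_tau = 1.0                                    # initial guess for E[tau]
    for _ in range(iters):
        # q(mu) = N(mu_n, 1/lam_n); note mu_n does not depend on E[tau]
        mu_n = (lam0 * mu0 + n * ybar) / (lam0 + n)
        lam_n = (lam0 + n) * e_tau
        # q(tau) = Gamma(a_n, b_n), using E_q[(y_i - mu)^2] = (y_i - mu_n)^2 + 1/lam_n
        a_n = a0 + 0.5 * (n + 1)
        b_n = b0 + 0.5 * (np.sum((y - mu_n) ** 2) + n / lam_n
                          + lam0 * ((mu_n - mu0) ** 2 + 1.0 / lam_n))
        e_tau = a_n / b_n
    return mu_n, lam_n, a_n, b_n

mu_n, lam_n, a_n, b_n = vb_normal_gamma([1.0, 2.0, 3.0, 4.0, 5.0])
# mu_n ≈ 3 (approximate posterior mean of mu); a_n / b_n ≈ 0.5 for E[tau]
```

The same alternating pattern — update the state factor given the noise-parameter factor, then vice versa — is what makes VB tractable for state-space models with unknown noise statistics.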
Moreover, a novel approximation method for Bayesian inference is proposed. The proposed Bayesian inference technique is based on Taylor series approximation of the logarithm of the likelihood function. The proposed approximation is devised for the case where the prior distribution belongs to the exponential family of distributions.
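As a hedged illustration of the general idea (not the thesis's exact construction), a second-order Taylor expansion of the log-likelihood keeps the posterior inside the conjugate family when the prior is Gaussian: the quadratic term adds to the prior precision and the linear term shifts the mean. The expansion point `x0`, and all function names, are assumptions for this sketch.

```python
import numpy as np

def taylor_gaussian_posterior(grad_loglik, hess_loglik, prior_mean, prior_cov, x0):
    """Gaussian approximate posterior obtained by replacing the log-likelihood
    with its second-order Taylor expansion at x0, under a Gaussian prior."""
    g = grad_loglik(x0)                  # gradient of log p(y | x) at x0
    H = hess_loglik(x0)                  # Hessian (negative semidefinite)
    prior_prec = np.linalg.inv(prior_cov)
    post_prec = prior_prec - H           # precisions add; -H is the likelihood information
    post_cov = np.linalg.inv(post_prec)
    post_mean = post_cov @ (prior_prec @ prior_mean + g - H @ x0)
    return post_mean, post_cov

# Example: y ~ N(x, 1) with y = 2 and prior x ~ N(0, 1); the log-likelihood
# is already quadratic, so the Taylor approximation is exact here.
y = 2.0
grad = lambda x: np.array([y - x[0]])
hess = lambda x: np.array([[-1.0]])
mean, cov = taylor_gaussian_posterior(grad, hess,
                                      np.array([0.0]), np.eye(1), np.array([0.0]))
# mean → [1.0], cov → [[0.5]]: the exact conjugate-Gaussian posterior
```

For a non-Gaussian likelihood the same update yields a Laplace-style approximation; the thesis develops the analogous expansion for general exponential-family priors.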
Finally, two contributions are dedicated to the mixture reduction (MR) problem. The first contribution generalizes the existing MR algorithms for Gaussian mixtures to the exponential family of distributions and compares them in an extended target tracking scenario. The second contribution proposes a new Gaussian mixture reduction algorithm which minimizes the reverse Kullback-Leibler divergence and has specific peak-preserving properties.
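To make the MR problem concrete, here is a minimal sketch of a standard greedy pairwise reduction for 1-D Gaussian mixtures: repeatedly merge the cheapest pair by moment matching. This is the conventional baseline, not the thesis's reverse-KL, peak-preserving algorithm, and the simple weighted-variance cost used below is an illustrative stand-in for the usual KL-based merge cost.

```python
def merge_two(w1, m1, v1, w2, m2, v2):
    """Moment-preserving merge of two weighted 1-D Gaussians: the single
    Gaussian with the same total weight, mean, and variance as the pair."""
    w = w1 + w2
    m = (w1 * m1 + w2 * m2) / w
    v = (w1 * (v1 + (m1 - m) ** 2) + w2 * (v2 + (m2 - m) ** 2)) / w
    return w, m, v

def reduce_mixture(components, target):
    """Greedy reduction of [(weight, mean, var), ...]: merge the pair whose
    merged component increases the total weighted variance the least."""
    comps = list(components)
    while len(comps) > target:
        best = None
        for i in range(len(comps)):
            for j in range(i + 1, len(comps)):
                w, m, v = merge_two(*comps[i], *comps[j])
                cost = w * v - comps[i][0] * comps[i][2] - comps[j][0] * comps[j][2]
                if best is None or cost < best[0]:
                    best = (cost, i, j, (w, m, v))
        _, i, j, merged = best
        comps = [c for k, c in enumerate(comps) if k not in (i, j)] + [merged]
    return comps

out = reduce_mixture([(0.4, 0.0, 1.0), (0.4, 0.1, 1.0), (0.2, 5.0, 1.0)], 2)
# the two overlapping components near 0 are merged; the isolated one survives
```

Moment matching of this kind minimizes the forward KL divergence for each merge; minimizing the reverse KL instead, as in the thesis, changes which components are merged and tends to preserve the mixture's peaks.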
Place, publisher, year, edition, pages
Linköping: Linköping University Electronic Press, 2015. 79 p.
Linköping Studies in Science and Technology. Dissertations, ISSN 0345-7524 ; 1710
Identifiers
URN: urn:nbn:se:liu:diva-121619
DOI: 10.3384/diss.diva-121619
ISBN: 978-91-7685-930-8 (print)
OAI: oai:DiVA.org:liu-121619
DiVA: diva2:858322
Public defence: 2015-11-06, Visionen, B-huset, Campus Valla, Linköping, 10:15 (English)
Koch, Johan Wolfgang, Dr.
Gustafsson, Fredrik, Professor
List of papers