Ancestor Sampling for Particle Gibbs
Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
University of California, Berkeley.
Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
2012 (English). In: Proceedings of the 26th Conference on Neural Information Processing Systems, 2012. Conference paper, Published paper (Refereed)
Abstract [en]

We present a novel method in the family of particle MCMC methods that we refer to as particle Gibbs with ancestor sampling (PG-AS). Similarly to the existing PG with backward simulation (PG-BS) procedure, we use backward sampling to (considerably) improve the mixing of the PG kernel. Instead of using separate forward and backward sweeps as in PG-BS, however, we achieve the same effect in a single forward sweep. We apply the PG-AS framework to the challenging class of non-Markovian state-space models. We develop a truncation strategy for these models that is applicable in principle to any backward-simulation-based method, but which is particularly well suited to the PG-AS framework. In particular, as we show in a simulation study, PG-AS can yield an order-of-magnitude improvement in accuracy relative to PG-BS, owing to its robustness to the truncation error. Several application examples are discussed, including Rao-Blackwellized particle smoothing and inference in degenerate state-space models.
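The single-forward-sweep idea described in the abstract can be sketched concretely. The following is a minimal illustration of one PG-AS sweep for a scalar linear-Gaussian state-space model; the toy model and all parameter values are the author's illustrative choices, not the paper's reference implementation. The reference trajectory is pinned as the last particle, and at each step its ancestor is resampled with weights proportional to the transition density into the reference state.

```python
import numpy as np

def pgas_kernel(y, x_ref, N=20, a=0.9, q=1.0, r=1.0, rng=None):
    """One sweep of particle Gibbs with ancestor sampling (PG-AS) for
    the scalar linear-Gaussian model (an illustrative toy setting)
        x_t = a * x_{t-1} + v_t,  v_t ~ N(0, q)
        y_t = x_t + e_t,          e_t ~ N(0, r).
    Given the observations y and a reference trajectory x_ref,
    returns a new trajectory drawn from the PG-AS Markov kernel."""
    rng = rng or np.random.default_rng()
    T = len(y)
    X = np.zeros((T, N))            # particle states
    A = np.zeros((T, N), dtype=int) # ancestor indices
    # t = 0: sample from the prior; pin the reference path as particle N-1
    X[0] = rng.normal(0.0, np.sqrt(q), N)
    X[0, -1] = x_ref[0]
    logw = -0.5 * (y[0] - X[0]) ** 2 / r
    w = np.exp(logw - logw.max())
    w /= w.sum()
    for t in range(1, T):
        # multinomial resampling of ancestors for particles 0..N-2
        A[t, :-1] = rng.choice(N, N - 1, p=w)
        # ancestor sampling for the reference particle: weight each
        # particle by its probability of transitioning to x_ref[t]
        log_w_as = np.log(w) - 0.5 * (x_ref[t] - a * X[t - 1]) ** 2 / q
        w_as = np.exp(log_w_as - log_w_as.max())
        A[t, -1] = rng.choice(N, p=w_as / w_as.sum())
        # propagate through the dynamics, keeping the reference state fixed
        X[t] = a * X[t - 1, A[t]] + rng.normal(0.0, np.sqrt(q), N)
        X[t, -1] = x_ref[t]
        logw = -0.5 * (y[t] - X[t]) ** 2 / r
        w = np.exp(logw - logw.max())
        w /= w.sum()
    # trace one trajectory back through the sampled ancestor indices
    k = rng.choice(N, p=w)
    traj = np.zeros(T)
    for t in reversed(range(T)):
        traj[t] = X[t, k]
        if t > 0:
            k = A[t, k]
    return traj
```

Iterating the kernel (feeding each output trajectory back in as the next reference) yields a Markov chain whose stationary distribution is the joint smoothing distribution; the ancestor-sampling step is what lets this single forward sweep mimic an explicit backward-simulation pass.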

Place, publisher, year, edition, pages
2012.
Keyword [en]
Particle Gibbs, Sampling
National Category
Probability Theory and Statistics; Control Engineering
Identifiers
URN: urn:nbn:se:liu:diva-88610
ISBN: 9781627480031 (print)
OAI: oai:DiVA.org:liu-88610
DiVA: diva2:605133
Conference
26th Conference on Neural Information Processing Systems, Lake Tahoe, NV, USA, 3-6 December, 2012
Projects
CADICS, CNDM
Funder
Swedish Research Council
Available from: 2013-02-13 Created: 2013-02-13 Last updated: 2013-10-08
In thesis
1. Particle filters and Markov chains for learning of dynamical systems
2013 (English). Doctoral thesis, comprehensive summary (Other academic)
Abstract [en]

Sequential Monte Carlo (SMC) and Markov chain Monte Carlo (MCMC) methods provide computational tools for systematic inference and learning in complex dynamical systems, such as nonlinear and non-Gaussian state-space models. This thesis builds upon several methodological advances within these classes of Monte Carlo methods.

Particular emphasis is placed on the combination of SMC and MCMC in so-called particle MCMC algorithms. These algorithms rely on SMC for generating samples from the often highly autocorrelated state trajectory. A specific particle MCMC algorithm, referred to as particle Gibbs with ancestor sampling (PGAS), is suggested. By making use of backward sampling ideas, albeit implemented in a forward-only fashion, PGAS enjoys good mixing even when using seemingly few particles in the underlying SMC sampler. This results in a computationally competitive particle MCMC algorithm. As illustrated in this thesis, PGAS is a useful tool for both Bayesian and frequentist parameter inference as well as for state smoothing. The PGAS sampler is successfully applied to the classical problem of Wiener system identification, and it is also used for inference in the challenging class of non-Markovian latent variable models.

Many nonlinear models encountered in practice contain some tractable substructure. As a second problem considered in this thesis, we develop Monte Carlo methods capable of exploiting such substructures to obtain more accurate estimators than would otherwise be possible. For the filtering problem, this can be done by using the well-known Rao-Blackwellized particle filter (RBPF). The RBPF is analysed in terms of asymptotic variance, resulting in an expression for the performance gain offered by Rao-Blackwellization. Furthermore, a Rao-Blackwellized particle smoother is derived, capable of addressing the smoothing problem in so-called mixed linear/nonlinear state-space models. The idea of Rao-Blackwellization is also used to develop an online algorithm for Bayesian parameter inference in nonlinear state-space models with affine parameter dependencies.
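The SMC component that both PGAS and the RBPF build on is, at its core, a particle filter. As an illustration (the benchmark model and all parameter values below are conventional choices in the particle-filtering literature, not taken from the thesis), a bootstrap particle filter for a standard nonlinear test model might look like:

```python
import numpy as np

def bootstrap_pf(y, N=500, q=1.0, r=1.0, rng=None):
    """Bootstrap particle filter for the scalar nonlinear benchmark model
        x_t = 0.5 x_{t-1} + 25 x_{t-1}/(1 + x_{t-1}^2) + 8 cos(1.2 t) + v_t
        y_t = x_t^2 / 20 + e_t,
    with v_t ~ N(0, q) and e_t ~ N(0, r). Returns the filtered means
    E[x_t | y_{1:t}] and an estimate of the log-likelihood."""
    rng = rng or np.random.default_rng(0)
    T = len(y)
    x = rng.normal(0.0, np.sqrt(5.0), N)  # particles from the prior
    w = np.full(N, 1.0 / N)
    means = np.zeros(T)
    loglik = 0.0
    for t in range(T):
        if t > 0:
            # multinomial resampling, then propagation through the dynamics
            x = x[rng.choice(N, N, p=w)]
            x = (0.5 * x + 25 * x / (1 + x**2) + 8 * np.cos(1.2 * t)
                 + rng.normal(0.0, np.sqrt(q), N))
        # weight by the observation likelihood (log-domain for stability)
        logw = -0.5 * (y[t] - x**2 / 20) ** 2 / r
        m = logw.max()
        loglik += m + np.log(np.mean(np.exp(logw - m))) \
                  - 0.5 * np.log(2 * np.pi * r)
        w = np.exp(logw - m)
        w /= w.sum()
        means[t] = np.sum(w * x)
    return means, loglik
```

Rao-Blackwellization, as studied in the thesis, improves on this generic scheme when part of the state is conditionally linear-Gaussian: that part is marginalized analytically (e.g. with a Kalman filter per particle), so the particles only have to cover the remaining nonlinear dimensions, which reduces the asymptotic variance of the resulting estimators.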

Place, publisher, year, edition, pages
Linköping University Electronic Press, 2013. 42 p.
Series
Linköping Studies in Science and Technology. Dissertations, ISSN 0345-7524 ; 1530
Keyword
Bayesian learning, System identification, Sequential Monte Carlo, Markov chain Monte Carlo, Particle MCMC, Particle filters, Particle smoothers
National Category
Control Engineering; Probability Theory and Statistics
Identifiers
URN: urn:nbn:se:liu:diva-97692
DOI: 10.3384/diss.diva-97692
ISBN: 978-91-7519-559-9
Public defence
2013-10-25, Visionen, Hus B, Campus Valla, Linköpings universitet, Linköping, 10:15 (English)
Projects
CNDM, CADICS
Funder
Swedish Research Council
Available from: 2013-10-08 Created: 2013-09-19 Last updated: 2013-10-08. Bibliographically approved

Open Access in DiVA

No full text

Authority records

Lindsten, Fredrik; Schön, Thomas
