Parameter elimination in particle Gibbs sampling
Uppsala Univ, Sweden.
Uppsala Univ, Sweden.
Uber AI, CA USA.
Linköping University, Department of Computer and Information Science, The Division of Statistics and Machine Learning. Linköping University, Faculty of Science & Engineering.
2019 (English). In: Advances in Neural Information Processing Systems 32 (NIPS 2019), Neural Information Processing Systems (NIPS), 2019, Vol. 32. Conference paper, Published paper (Refereed)
Abstract [en]

Bayesian inference in state-space models is challenging due to high-dimensional state trajectories. A viable approach is particle Markov chain Monte Carlo, combining MCMC and sequential Monte Carlo to form "exact approximations" to otherwise intractable MCMC methods. The performance of the approximation is limited to that of the exact method. We focus on particle Gibbs and particle Gibbs with ancestor sampling, improving their performance beyond that of the underlying Gibbs sampler (which they approximate) by marginalizing out one or more parameters. This is possible when the parameter prior is conjugate to the complete data likelihood. Marginalization yields a non-Markovian model for inference, but we show that, in contrast to the general case, this method still scales linearly in time. While marginalization can be cumbersome to implement, recent advances in probabilistic programming have enabled its automation. We demonstrate how the marginalized methods are viable as efficient inference backends in probabilistic programming, and illustrate with examples in ecology and epidemiology.
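
The sketch below is a rough illustration of the marginalization idea described in the abstract, not the authors' implementation (the paper targets probabilistic programming backends; this is hand-written NumPy). The model, function names, and parameter values are assumptions chosen for illustration: a Gaussian random walk with unknown process-noise variance q under a conjugate inverse-gamma prior, so q can be integrated out and the marginal transition becomes a Student-t whose parameters depend on the past path only through two sufficient statistics, keeping each sweep linear in time. Ancestor sampling is omitted for brevity.

# Minimal sketch (illustrative only): particle Gibbs for a Gaussian random-walk
# state-space model with the process-noise variance q marginalized out.
#
# Assumed toy model:
#   x_0 ~ N(0, 1),  x_t = x_{t-1} + v_t,  v_t ~ N(0, q),  y_t ~ N(x_t, r)
#   q ~ InvGamma(a0, b0)   (conjugate to the complete-data likelihood)
import numpy as np

def csmc_sweep(y, x_ref, n_part=20, a0=1.0, b0=1.0, r=0.5, rng=None):
    """One conditional SMC sweep with q marginalized; returns a new reference path."""
    rng = np.random.default_rng() if rng is None else rng
    T = len(y)
    X = np.zeros((n_part, T + 1))      # particle trajectories x_{0:T}
    sse = np.zeros(n_part)             # sum of squared increments along each path
    X[:, 0] = rng.normal(0.0, 1.0, size=n_part)
    X[-1, :] = x_ref                   # the last particle is pinned to the reference
    logw = np.zeros(n_part)

    for t in range(1, T + 1):
        if t > 1:
            # Multinomial resampling for particles 0..N-2; the reference keeps itself.
            w = np.exp(logw - logw.max()); w /= w.sum()
            idx = np.append(rng.choice(n_part, size=n_part - 1, p=w), n_part - 1)
            X[:, :t] = X[idx, :t]
            sse = sse[idx]

        # Marginal ("collapsed") transition: with q ~ InvGamma(a_t, b_t) given the
        # increments on each particle's path, the next increment is Student-t.
        a_t = a0 + (t - 1) / 2.0
        b_t = b0 + sse / 2.0
        incr = np.sqrt(b_t / a_t) * rng.standard_t(df=2.0 * a_t, size=n_part)
        incr[-1] = x_ref[t] - X[-1, t - 1]   # reference particle follows its own path
        X[:, t] = X[:, t - 1] + incr
        sse = sse + incr ** 2                # O(1) sufficient-statistic update

        # Weight by the observation density N(y_t; x_t, r) (constant term dropped).
        logw = -0.5 * (y[t - 1] - X[:, t]) ** 2 / r

    # Draw the new reference trajectory in proportion to the final weights.
    w = np.exp(logw - logw.max()); w /= w.sum()
    return X[rng.choice(n_part, p=w)].copy()

# Illustrative use: a few particle Gibbs iterations on simulated data.
rng = np.random.default_rng(0)
T = 50
x_true = np.concatenate(([0.0], np.cumsum(rng.normal(0.0, 0.3, size=T))))
y = x_true[1:] + rng.normal(0.0, np.sqrt(0.5), size=T)
x_ref = np.zeros(T + 1)
for _ in range(200):
    x_ref = csmc_sweep(y, x_ref, rng=rng)
# If a draw of q itself is needed, it can be recovered afterwards from its
# InvGamma posterior given the increments of x_ref.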

Place, publisher, year, edition, pages
Neural Information Processing Systems (NIPS), 2019. Vol. 32
Series
Advances in Neural Information Processing Systems, ISSN 1049-5258
National Category
Probability Theory and Statistics
Identifiers
URN: urn:nbn:se:liu:diva-167714
ISI: 000535866900050
OAI: oai:DiVA.org:liu-167714
DiVA, id: diva2:1454681
Conference
33rd Conference on Neural Information Processing Systems (NeurIPS)
Note

Funding agencies: Swedish Research Council [2016-04278, 2016-06079]; Swedish Foundation for Strategic Research (SSF) [ICA16-0015, RIT15-0012]; Wallenberg AI, Autonomous Systems and Software Program (WASP), Knut and Alice Wallenberg Foundation

Available from: 2020-07-20 Created: 2020-07-20 Last updated: 2020-07-20

Open Access in DiVA

No full text in DiVA

Search in DiVA

By author/editor
Lindsten, Fredrik
By organisation
The Division of Statistics and Machine Learning, Faculty of Science & Engineering
On the subject
Probability Theory and Statistics
