Sequential Monte Carlo for Graphical Models
Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
University of Cambridge, Cambridge, UK.
Uppsala University, Uppsala, Sweden.
2014 (English). In: Advances in Neural Information Processing Systems, 2014, pp. 1862-1870. Conference paper, Published paper (Refereed)
Abstract [en]

We propose a new framework for how to use sequential Monte Carlo (SMC) algorithms for inference in probabilistic graphical models (PGM). Via a sequential decomposition of the PGM we find a sequence of auxiliary distributions defined on a monotonically increasing sequence of probability spaces. By targeting these auxiliary distributions using SMC we are able to approximate the full joint distribution defined by the PGM. One of the key merits of the SMC sampler is that it provides an unbiased estimate of the partition function of the model. We also show how it can be used within a particle Markov chain Monte Carlo framework in order to construct high-dimensional block-sampling algorithms for general PGMs.
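To make the sequential decomposition concrete, here is a deliberately small sketch that runs SMC on an assumed chain-structured model (standard-normal random-walk proposals and Gaussian unary potentials) and accumulates the usual SMC estimate of the partition function. It is only an illustration under these assumptions, not the general algorithm of the paper; `smc_chain`, the potentials and all parameter values are invented for the example.

```python
import numpy as np
from scipy.special import logsumexp

def smc_chain(T=10, N=1000, seed=0):
    """Toy SMC on a chain with Gaussian potentials; returns the log of the Z estimate."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(N)                 # sample x_1 from the proposal
    log_w = -0.5 * x**2                        # assumed unary potential psi_1(x_1)
    log_Z = 0.0
    for t in range(1, T):
        log_Z += logsumexp(log_w) - np.log(N)  # add log of the mean incremental weight
        w = np.exp(log_w - logsumexp(log_w))   # normalized weights
        idx = rng.choice(N, size=N, p=w)       # multinomial resampling
        x = x[idx] + rng.standard_normal(N)    # propagate with a random-walk proposal
        log_w = -0.5 * x**2                    # reweight with the next assumed potential
    log_Z += logsumexp(log_w) - np.log(N)
    return log_Z                               # exp(log_Z) is the SMC estimate of Z

print(smc_chain())
```

In this toy setup the auxiliary distributions are simply the chain truncated after t variables, and the product of per-step average weights, exp(log_Z) above, plays the role of the partition-function estimate mentioned in the abstract.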

Place, publisher, year, edition, pages
2014. pp. 1862-1870
National subject category
Computer Sciences; Probability Theory and Statistics; Control Engineering
Identifiers
URN: urn:nbn:se:liu:diva-112967; OAI: oai:DiVA.org:liu-112967; DiVA id: diva2:775992
Conference
Neural Information Processing Systems (NIPS)
Available from: 2015-01-06. Created: 2015-01-06. Last updated: 2018-11-09. Bibliographically approved.
Included in thesis
1. Machine learning using approximate inference: Variational and sequential Monte Carlo methods
2018 (English). Doctoral thesis, comprehensive summary (Other academic)
Abstract [en]

Automatic decision making and pattern recognition under uncertainty are difficult tasks that are ubiquitous in our everyday life. The systems we design, and the technology we develop, require us to coherently represent and work with uncertainty in data. Probabilistic models and probabilistic inference give us a powerful framework for solving this problem. Using this framework, while enticing, results in difficult-to-compute integrals and probabilities when conditioning on the observed data. This means we need approximate inference: methods that solve the problem approximately using a systematic approach. In this thesis we develop new methods for efficient approximate inference in probabilistic models.

There are generally two approaches to approximate inference: variational methods and Monte Carlo methods. In Monte Carlo methods we use a large number of random samples to approximate the integral of interest. With variational methods, on the other hand, we turn the integration problem into an optimization problem. We develop algorithms of both types and bridge the gap between them.
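As a concrete toy illustration of the Monte Carlo idea in the paragraph above (not an example taken from the thesis), the snippet below approximates an expectation under a standard normal by averaging over random samples; the accuracy improves as more samples are drawn.

```python
import numpy as np

# Plain Monte Carlo estimate of E[f(x)] under p(x) = N(0, 1) with
# f(x) = x**2, whose true value is 1. Purely illustrative.
rng = np.random.default_rng(0)
samples = rng.standard_normal(100_000)   # a large number of random samples from p(x)
estimate = np.mean(samples**2)           # the sample average approximates the integral
print(estimate)                          # close to 1.0
```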

First, we present a self-contained tutorial on the popular sequential Monte Carlo (SMC) class of methods. Next, we propose new algorithms and applications based on SMC for approximate inference in probabilistic graphical models. We derive nested sequential Monte Carlo, a new algorithm particularly well suited for inference in a large class of high-dimensional probabilistic models. Then, inspired by similar ideas, we derive interacting particle Markov chain Monte Carlo to make use of parallelization to speed up approximate inference for universal probabilistic programming languages. After that, we show how the rejection sampling process used when generating gamma-distributed random variables can be exploited to speed up variational inference. Finally, we bridge the gap between SMC and variational methods by developing variational sequential Monte Carlo, a new flexible family of variational approximations.

Place, publisher, year, edition, pages
Linköping: Linköping University Electronic Press, 2018. p. 39
Series
Linköping Studies in Science and Technology. Dissertations, ISSN 0345-7524; 1969
National subject category
Control Engineering; Computer Sciences; Signal Processing
Identifiers
urn:nbn:se:liu:diva-152647 (URN); 10.3384/diss.diva-152647 (DOI); 9789176851616 (ISBN)
Public defence
2018-12-14, Ada Lovelace, Building B, Campus Valla, Linköping, 10:15 (English)
Available from: 2018-11-27. Created: 2018-11-09. Last updated: 2019-09-26. Bibliographically approved.

Open Access in DiVA
fulltext (428 kB): FULLTEXT01.pdf, application/pdf

Authors
Andersson Naesseth, Christian; Lindsten, Fredrik; Schön, Thomas
