Schön, Thomas B., Professor (ORCID iD: orcid.org/0000-0001-5183-234X)
Publications (5 of 5)
Svensson, A., Lindsten, F. & Schön, T. B. (2018). Learning nonlinear state-space models using smooth particle-filter-based likelihood approximations. In: 18th IFAC Symposium on System Identification (SYSID 2018), Proceedings. Paper presented at SYSID 2018, July 9–11, Stockholm, Sweden (pp. 652-657). Elsevier
Learning nonlinear state-space models using smooth particle-filter-based likelihood approximations
2018 (English). In: 18th IFAC Symposium on System Identification (SYSID 2018), Proceedings, Elsevier, 2018, p. 652-657. Conference paper, Published paper (Refereed)
Abstract [en]

When classical particle filtering algorithms are used for maximum likelihood parameter estimation in nonlinear state-space models, a key challenge is that estimates of the likelihood function and its derivatives are inherently noisy. The key idea in this paper is to run a particle filter based on a current parameter estimate, but then use the output from this particle filter to re-evaluate the likelihood function approximation also for other parameter values. This results in a (local) deterministic approximation of the likelihood and any standard optimization routine can be applied to find the maximum of this approximation. By iterating this procedure we eventually arrive at a final parameter estimate.
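
To make the idea concrete, here is a minimal Python sketch (an illustration under simplifying assumptions, not the authors' exact estimator): a bootstrap particle filter is run at the current parameter value, its particles are frozen, and the observation densities are re-evaluated at nearby parameter values to form a deterministic local log-likelihood surrogate, which is then maximized with a standard optimizer. The toy model, the parameterization (the log of the measurement-noise variance) and all tuning constants are assumptions made for illustration.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

rng = np.random.default_rng(0)

def simulate(T, r_true=0.1):
    """Toy model: x_{t+1} = 0.5 x_t + v_t, v_t ~ N(0, 1); y_t = x_t + e_t, e_t ~ N(0, r_true)."""
    x, ys = 0.0, []
    for _ in range(T):
        x = 0.5 * x + rng.normal()
        ys.append(x + rng.normal(scale=np.sqrt(r_true)))
    return np.array(ys)

def particle_filter(y, theta, N=500):
    """Bootstrap particle filter at theta = log(measurement-noise variance).
    Returns the particles used for weighting at each time step."""
    r = np.exp(theta)
    x = rng.normal(size=N)
    stored = []
    for t in range(len(y)):
        stored.append(x.copy())
        logw = norm.logpdf(y[t], loc=x, scale=np.sqrt(r))
        w = np.exp(logw - logw.max()); w /= w.sum()
        x = x[rng.choice(N, size=N, p=w)]           # resample
        x = 0.5 * x + rng.normal(size=N)            # propagate
    return np.array(stored)                         # shape (T, N)

def surrogate_loglik(theta, y, particles):
    """Deterministic local approximation: re-evaluate the observation densities
    at a new theta while reusing the frozen particles."""
    r = np.exp(theta)
    return sum(np.log(np.mean(norm.pdf(y[t], loc=particles[t], scale=np.sqrt(r))))
               for t in range(len(y)))

y = simulate(T=200)
theta = np.log(1.0)                                 # initial parameter estimate
for _ in range(10):                                 # iterate: particle filter + deterministic optimization
    particles = particle_filter(y, theta)
    res = minimize_scalar(lambda th: -surrogate_loglik(th, y, particles),
                          bounds=(theta - 1.0, theta + 1.0), method="bounded")
    theta = res.x
print("estimated measurement-noise variance:", np.exp(theta))
```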

Place, publisher, year, edition, pages
Elsevier, 2018
Series
IFAC-PapersOnLine, ISSN 2405-8963; 51:15
National Category
Signal Processing; Control Engineering
Identifiers
urn:nbn:se:liu:diva-159807 (URN); 10.1016/j.ifacol.2018.09.216 (DOI); 000446599200111 ()
Conference
SYSID 2018, July 9–11, Stockholm, Sweden
Funder
Swedish Foundation for Strategic Research, RIT15-0012, ICA16-0015; Swedish Research Council, 621-2016-06079, 2016-04278
Available from: 2018-10-08 Created: 2019-08-22 Last updated: 2019-08-23
Svensson, A., Schön, T. B. & Lindsten, F. (2018). Learning of state-space models with highly informative observations: A tempered sequential Monte Carlo solution. Mechanical systems and signal processing, 104, 915-928
Learning of state-space models with highly informative observations: A tempered sequential Monte Carlo solution
2018 (English). In: Mechanical systems and signal processing, ISSN 0888-3270, E-ISSN 1096-1216, Vol. 104, p. 915-928. Article in journal (Refereed), Published
Abstract [en]

Probabilistic (or Bayesian) modeling and learning offers interesting possibilities for systematic representation of uncertainty using probability theory. However, probabilistic learning often leads to computationally challenging problems. Some problems of this type that were previously intractable can now be solved on standard personal computers thanks to recent advances in Monte Carlo methods. In particular, for learning of unknown parameters in nonlinear state-space models, methods based on the particle filter (a Monte Carlo method) have proven very useful. A notoriously challenging problem, however, still occurs when the observations in the state-space model are highly informative, i.e. when there is very little or no measurement noise present, relative to the amount of process noise. The particle filter will then struggle in estimating one of the basic components for probabilistic learning, namely the likelihood p(data | parameters). To this end we suggest an algorithm which initially assumes that there is a substantial amount of artificial measurement noise present. The variance of this noise is sequentially decreased in an adaptive fashion such that we, in the end, recover the original problem or possibly a very close approximation of it. The main component in our algorithm is a sequential Monte Carlo (SMC) sampler, which gives our proposed method a clear resemblance to the SMC² method. Another natural link is also made to the ideas underlying approximate Bayesian computation (ABC). We illustrate the method with numerical examples, and in particular show promising results for a challenging Wiener-Hammerstein benchmark problem.
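
The tempering mechanism can be illustrated with a small, self-contained sketch. In the paper the likelihood of the state-space model is intractable and is itself estimated with a particle filter (hence the resemblance to SMC²); the sketch below replaces it with a tractable Gaussian likelihood so that only the adaptive tempering of the artificial noise variance is visible. The effective-sample-size target of roughly N/2, the bisection search and the random-walk rejuvenation moves are illustrative choices, not the paper's exact settings.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)

# Highly informative observations: y_t = theta + e_t with a very small noise variance.
theta_true, r_true, T = 1.3, 1e-4, 50
y = theta_true + rng.normal(scale=np.sqrt(r_true), size=T)

def loglik(theta, s):
    """Log-likelihood with artificial noise variance s added on top of r_true."""
    return norm.logpdf(y[:, None], loc=theta[None, :],
                       scale=np.sqrt(r_true + s)).sum(axis=0)

def logprior(theta):
    return norm.logpdf(theta, loc=0.0, scale=2.0)

def ess(logw):
    w = np.exp(logw - logw.max()); w /= w.sum()
    return 1.0 / np.sum(w ** 2)

N = 1000
theta = rng.normal(loc=0.0, scale=2.0, size=N)    # samples from the prior
s = 10.0                                          # large initial artificial noise variance
logw = loglik(theta, s)                           # weight the prior samples against the first target

while True:
    # Resample, then rejuvenate with one random-walk Metropolis move per sample,
    # targeting the current tempered posterior.
    w = np.exp(logw - logw.max()); w /= w.sum()
    theta = theta[rng.choice(N, size=N, p=w)]
    prop = theta + rng.normal(scale=0.5 * theta.std() + 1e-12, size=N)
    log_acc = loglik(prop, s) + logprior(prop) - loglik(theta, s) - logprior(theta)
    theta = np.where(np.log(rng.uniform(size=N)) < log_acc, prop, theta)
    if s == 0.0:
        break

    # Adaptively pick the next, smaller artificial variance: as aggressive as possible
    # while keeping the effective sample size near N/2.
    incr = lambda s_new: loglik(theta, s_new) - loglik(theta, s)
    if ess(incr(0.0)) >= N / 2:
        s_next = 0.0
    else:
        lo, hi = 0.0, s
        for _ in range(50):
            mid = 0.5 * (lo + hi)
            lo, hi = (mid, hi) if ess(incr(mid)) < N / 2 else (lo, mid)
        s_next = hi
    logw, s = incr(s_next), s_next

print("posterior mean:", theta.mean(), " posterior std:", theta.std())
```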

Place, publisher, year, edition, pages
Academic Press / Elsevier Science Ltd, 2018
Keywords
Probabilistic modelling, Bayesian methods, Nonlinear system identification, Sequential Monte Carlo, Particle filter, Approximate Bayesian computations, Highly informative observations, Tempering, Wiener-Hammerstein model
National Category
Probability Theory and Statistics
Identifiers
urn:nbn:se:liu:diva-159808 (URN); 10.1016/j.ymssp.2017.09.016 (DOI); 000423652800057 ()
Funder
Swedish Research Council, 621-2013-5524, 2016-04278, 621-2016-06079; Swedish Foundation for Strategic Research, RIT15-0012
Available from: 2018-05-15 Created: 2019-08-22 Last updated: 2019-08-23
Schön, T. B., Svensson, A., Murray, L. & Lindsten, F. (2018). Probabilistic learning of nonlinear dynamical systems using sequential Monte Carlo. Mechanical systems and signal processing, 104, 866-883
Probabilistic learning of nonlinear dynamical systems using sequential Monte Carlo
2018 (English). In: Mechanical systems and signal processing, ISSN 0888-3270, E-ISSN 1096-1216, Vol. 104, p. 866-883. Article in journal (Refereed), Published
Abstract [en]

Probabilistic modeling provides the capability to represent and manipulate uncertainty in data, models, predictions and decisions. We are concerned with the problem of learning probabilistic models of dynamical systems from measured data. Specifically, we consider learning of probabilistic nonlinear state-space models. There is no closed-form solution available for this problem, implying that we are forced to use approximations. In this tutorial we will provide a self-contained introduction to one of the state-of-the-art methods, the particle Metropolis-Hastings algorithm, which has proven to offer a practical approximation. This is a Monte Carlo-based method, where the particle filter is used to guide a Markov chain Monte Carlo method through the parameter space. One of the key merits of the particle Metropolis-Hastings algorithm is that it is guaranteed to converge to the "true solution" under mild assumptions, despite being based on a particle filter with only a finite number of particles. We will also provide a motivating numerical example illustrating the method, using a modeling language tailored for sequential Monte Carlo methods. The intention of modeling languages of this kind is to open up the power of sophisticated Monte Carlo methods, including particle Metropolis-Hastings, to a large group of users without requiring them to know all the underlying mathematical details.
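
As a concrete illustration of the core construction, here is a minimal particle Metropolis-Hastings sketch for a toy scalar state-space model: the particle filter provides an unbiased estimate of the likelihood, which is plugged into the Metropolis-Hastings acceptance ratio in place of the intractable exact likelihood. The model, prior, proposal scale and particle/iteration counts are illustrative assumptions, and the sketch does not use the probabilistic modeling language discussed in the tutorial.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)

# Toy model: x_{t+1} = theta * x_t + v_t, v_t ~ N(0, 1);  y_t = x_t + e_t, e_t ~ N(0, 0.5^2).
theta_true, T = 0.7, 200
x, ys = 0.0, []
for _ in range(T):
    x = theta_true * x + rng.normal()
    ys.append(x + rng.normal(scale=0.5))
y = np.array(ys)

def pf_loglik(theta, N=200):
    """Bootstrap particle filter estimate of log p(y_{1:T} | theta)."""
    x = rng.normal(size=N)
    ll = 0.0
    for t in range(T):
        logw = norm.logpdf(y[t], loc=x, scale=0.5)
        m = logw.max()
        w = np.exp(logw - m)
        ll += m + np.log(w.mean())                    # log of the average importance weight
        x = x[rng.choice(N, size=N, p=w / w.sum())]   # resample
        x = theta * x + rng.normal(size=N)            # propagate
    return ll

def logprior(theta):
    return norm.logpdf(theta, loc=0.0, scale=1.0)

# Random-walk particle Metropolis-Hastings over theta.
theta, ll = 0.0, pf_loglik(0.0)
chain = []
for _ in range(2000):
    prop = theta + rng.normal(scale=0.05)
    ll_prop = pf_loglik(prop)
    if np.log(rng.uniform()) < ll_prop + logprior(prop) - ll - logprior(theta):
        theta, ll = prop, ll_prop                     # accept, and keep the likelihood estimate
    chain.append(theta)

print("posterior mean of theta:", np.mean(chain[500:]))
```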

Place, publisher, year, edition, pages
Academic Press / Elsevier Science Ltd, 2018
Keywords
Probabilistic modeling, Nonlinear dynamical systems, System identification, Parameter estimation, Bayesian methods, Metropolis-Hastings, Sequential Monte Carlo, Particle filter
National Category
Probability Theory and Statistics
Identifiers
urn:nbn:se:liu:diva-159809 (URN); 10.1016/j.ymssp.2017.10.033 (DOI); 000423652800054 ()
Funder
Swedish Research Council, 621-2013-5524, 2016-04278, 621-2016-06079; Swedish Foundation for Strategic Research, RIT15-0012
Available from: 2018-05-14 Created: 2019-08-22 Last updated: 2019-08-23
Jacob, P., Lindsten, F. & Schön, T. B. (2018). Retracted article: Smoothing with Couplings of Conditional Particle Filters. Journal of the American Statistical Association
Retracted article: Smoothing with Couplings of Conditional Particle Filters
2018 (English). In: Journal of the American Statistical Association, ISSN 0162-1459, E-ISSN 1537-274X. Article in journal (Refereed), Epub ahead of print
Abstract [en]

In state space models, smoothing refers to the task of estimating a latent stochastic process given noisy measurements related to the process. We propose an unbiased estimator of smoothing expectations. The lack-of-bias property has methodological benefits: independent estimators can be generated in parallel, and confidence intervals can be constructed from the central limit theorem to quantify the approximation error. To design unbiased estimators, we combine a generic debiasing technique for Markov chains with a Markov chain Monte Carlo algorithm for smoothing. The resulting procedure is widely applicable, and we show in numerical experiments that the removal of the bias comes at a manageable increase in variance. We establish the validity of the proposed estimators under mild assumptions. Numerical experiments are provided on toy models, including a setting of highly informative observations, and for a realistic Lotka-Volterra model with an intractable transition density.
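
The generic debiasing ingredient can be sketched on a toy target. Below, two Metropolis-Hastings chains on a standard normal distribution are coupled through a maximal coupling of their proposals and a shared acceptance variable, with one chain started a step ahead; once the chains meet, a telescoping correction turns the usual ergodic average into an unbiased estimator. In the paper the same construction is applied to conditional particle filter chains for smoothing in state-space models; the toy target, the lag-one coupling and all constants below are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm

logpi = lambda z: -0.5 * z * z      # log-density of the N(0, 1) target, up to a constant

def max_coupling_normal(mu1, mu2, sigma, rng):
    """Maximal coupling of N(mu1, sigma^2) and N(mu2, sigma^2): the two draws
    are equal with the highest possible probability."""
    x = rng.normal(mu1, sigma)
    if rng.uniform() * norm.pdf(x, mu1, sigma) <= norm.pdf(x, mu2, sigma):
        return x, x
    while True:
        y = rng.normal(mu2, sigma)
        if rng.uniform() * norm.pdf(y, mu2, sigma) > norm.pdf(y, mu1, sigma):
            return x, y

def mh_step(x, rng, step=1.0):
    xp = rng.normal(x, step)
    return xp if np.log(rng.uniform()) < logpi(xp) - logpi(x) else x

def coupled_mh_step(x, y, rng, step=1.0):
    xp, yp = max_coupling_normal(x, y, step, rng)
    logu = np.log(rng.uniform())    # a common uniform couples the accept/reject decisions
    return (xp if logu < logpi(xp) - logpi(x) else x,
            yp if logu < logpi(yp) - logpi(y) else y)

def unbiased_estimate(h, k, rng):
    """H_k = h(X_k) + sum_{l=k+1}^{tau-1} [h(X_l) - h(Y_{l-1})], unbiased for E_pi[h]."""
    X = [rng.normal(0.0, 5.0)]      # overdispersed initialization
    Y = [rng.normal(0.0, 5.0)]
    X.append(mh_step(X[0], rng))    # the X chain starts one step ahead of the Y chain
    t, tau = 1, None
    while tau is None or t < k:
        x_next, y_next = coupled_mh_step(X[t], Y[t - 1], rng)
        X.append(x_next); Y.append(y_next)
        t += 1
        if tau is None and X[t] == Y[t - 1]:
            tau = t                 # meeting time; the chains stay equal afterwards
    return h(X[k]) + sum(h(X[l]) - h(Y[l - 1]) for l in range(k + 1, tau))

rng = np.random.default_rng(3)
estimates = [unbiased_estimate(lambda z: z ** 2, k=10, rng=rng) for _ in range(500)]
print("estimate of E[X^2] under N(0, 1):", np.mean(estimates))   # should be close to 1
```

Independent copies of this estimator can be averaged, exactly as the abstract describes, and a confidence interval can be read off from their sample variance.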

Place, publisher, year, edition, pages
Taylor & Francis, 2018
National Category
Probability Theory and Statistics
Identifiers
urn:nbn:se:liu:diva-159811 (URN); 10.1080/01621459.2018.1505625 (DOI)
Funder
Swedish Foundation for Strategic Research, RIT15-0012; Swedish Research Council, 2016-04278; Swedish Research Council, 621-2016-06079
Note

Statement of Retraction

We, the Authors, Editors, and Publishers of the Journal of the American Statistical Association, have retracted the following article:

P. E. Jacob, F. Lindsten, T. B. Schön. “Smoothing with Couplings of Conditional Particle Filters,” the Journal of the American Statistical Association. Published Online 6 August 2018. DOI: 10.1080/01621459.2018.1505625.

Following publication on the Latest Articles page of the journal's website, it came to light that there existed a bug in the code used to produce the numbers initially presented in the retracted version. The results themselves remain the same, and not a word will have been changed when the article publishes in final form. The final article will be the version of record in good standing.

We have been informed in our decision-making by our policy on publishing ethics and integrity and the COPE guidelines on retractions.

Available from: 2019-08-22 Created: 2019-08-22 Last updated: 2019-08-22
Lindsten, F., Bunch, P., Särkkä, S., Schön, T. B. & Godsill, S. J. (2016). Rao–Blackwellized particle smoothers for conditionally linear Gaussian models. IEEE Journal on Selected Topics in Signal Processing, 10(2), 353-365
Rao–Blackwellized particle smoothers for conditionally linear Gaussian models
2016 (English). In: IEEE Journal on Selected Topics in Signal Processing, ISSN 1932-4553, E-ISSN 1941-0484, Vol. 10, no 2, p. 353-365. Article in journal (Refereed), Published
Abstract [en]

Sequential Monte Carlo (SMC) methods, such as the particle filter, are by now one of the standard computational techniques for addressing the filtering problem in general state-space models. However, many applications require post-processing of data offline. In such scenarios the smoothing problem, in which all the available data is used to compute state estimates, is of central interest. We consider the smoothing problem for a class of conditionally linear Gaussian models. We present a forward-backward-type Rao-Blackwellized particle smoother (RBPS) that is able to exploit the tractable substructure present in these models. Akin to the well-known Rao-Blackwellized particle filter, the proposed RBPS marginalizes out a conditionally tractable subset of state variables, effectively making use of SMC only for the “intractable part” of the model. Compared to existing RBPS, two key features of the proposed method are: 1) it does not require structural approximations of the model, and 2) the aforementioned marginalization is done both in the forward direction and in the backward direction.
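
A sketch of the forward building block, a Rao-Blackwellized (marginalized) particle filter for a toy conditionally linear Gaussian model, is given below: particles are used only for the nonlinear state, while the conditionally linear Gaussian state is handled exactly by a per-particle Kalman filter. The forward-backward Rao-Blackwellized smoother proposed in the paper builds on this structure but is not reproduced here; the model and all constants are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(4)

# Toy model:  eta_{t+1} = 0.5*eta_t + 25*eta_t/(1 + eta_t^2) + v_t   (nonlinear state)
#             z_{t+1}   = 0.9*z_t + w_t                              (linear state)
#             y_t       = eta_t + z_t + e_t
q_eta, q_z, r = 1.0, 0.1, 0.5
f = lambda eta: 0.5 * eta + 25.0 * eta / (1.0 + eta ** 2)

def simulate(T):
    eta, z, ys = 0.0, 0.0, []
    for _ in range(T):
        ys.append(eta + z + rng.normal(scale=np.sqrt(r)))
        eta = f(eta) + rng.normal(scale=np.sqrt(q_eta))
        z = 0.9 * z + rng.normal(scale=np.sqrt(q_z))
    return np.array(ys)

def rao_blackwellized_pf(y, N=500):
    """Particles for eta; one Kalman filter per particle for the linear state z."""
    eta = np.zeros(N)                               # nonlinear-state particles
    z_mean, z_var = np.zeros(N), np.ones(N)         # per-particle Kalman statistics
    filt_eta, filt_z = [], []
    for t in range(len(y)):
        # Weight with the predictive likelihood p(y_t | eta_{1:t}, y_{1:t-1}),
        # which is Gaussian because z has been marginalized out.
        s = z_var + r
        logw = norm.logpdf(y[t], loc=eta + z_mean, scale=np.sqrt(s))
        w = np.exp(logw - logw.max()); w /= w.sum()

        # Kalman measurement update of z, conditionally on each particle.
        gain = z_var / s
        z_mean = z_mean + gain * (y[t] - eta - z_mean)
        z_var = (1.0 - gain) * z_var
        filt_eta.append(np.sum(w * eta))            # filtered means at time t
        filt_z.append(np.sum(w * z_mean))

        # Resample jointly, then propagate: new particles for eta, Kalman time update for z.
        idx = rng.choice(N, size=N, p=w)
        eta, z_mean, z_var = eta[idx], z_mean[idx], z_var[idx]
        eta = f(eta) + rng.normal(scale=np.sqrt(q_eta), size=N)
        z_mean, z_var = 0.9 * z_mean, 0.81 * z_var + q_z
    return np.array(filt_eta), np.array(filt_z)

y = simulate(T=100)
eta_hat, z_hat = rao_blackwellized_pf(y)
print("filtered eta at the final time:", eta_hat[-1], " filtered z:", z_hat[-1])
```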

Place, publisher, year, edition, pages
IEEE, 2016
Keywords
Monte Carlo methods, particle filters, particle smoothers, Rao-Blackwellization, backward sampling
National Category
Signal Processing; Control Engineering
Identifiers
urn:nbn:se:liu:diva-159814 (URN); 10.1109/JSTSP.2015.2506543 (DOI); 000370957200011 ()
Funder
Swedish Research Council, 637-2014-466; Swedish Research Council, 621-2013-5524
Available from: 2016-02-12 Created: 2019-08-22 Last updated: 2019-08-23