liu.se: Search for publications in DiVA
Publications (3 of 3)
Svensson, A., Lindsten, F. & Schön, T. B. (2018). Learning nonlinear state-space models using smooth particle-filter-based likelihood approximations. In: 18th IFAC Symposium on System Identification (SYSID 2018) Proceedings. Paper presented at SYSID 2018, July 9–11, Stockholm, Sweden (pp. 652-657). Elsevier
Learning nonlinear state-space models using smooth particle-filter-based likelihood approximations
2018 (English). In: 18th IFAC Symposium on System Identification (SYSID 2018) Proceedings, Elsevier, 2018, p. 652-657. Conference paper, Published paper (Refereed)
Abstract [en]

When classical particle filtering algorithms are used for maximum likelihood parameter estimation in nonlinear state-space models, a key challenge is that estimates of the likelihood function and its derivatives are inherently noisy. The key idea in this paper is to run a particle filter based on the current parameter estimate, but then use the output from this particle filter to re-evaluate the likelihood approximation for other parameter values as well. This results in a (local) deterministic approximation of the likelihood, and any standard optimization routine can be applied to find the maximum of this approximation. By iterating this procedure we eventually arrive at a final parameter estimate.
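As a toy illustration of the idea in the abstract (not the paper's actual implementation), the sketch below uses a scalar linear Gaussian model x[t+1] = theta*x[t] + v[t], y[t] = x[t] + e[t]: a bootstrap particle filter is run at the current estimate, and the stored particle system is then reused to evaluate a deterministic likelihood surrogate at nearby parameter values by importance-reweighting the transition densities. The model, the grid optimizer, and all function names are illustrative assumptions.

```python
import numpy as np

def log_norm(x, mean, var):
    # Elementwise Gaussian log-density.
    return -0.5 * np.log(2 * np.pi * var) - 0.5 * (x - mean) ** 2 / var

def run_pf(y, theta, N, q, r, rng):
    # Bootstrap particle filter run at `theta`; stores each (ancestor,
    # particle) pair so the likelihood can be re-evaluated afterwards.
    x = np.zeros(N)
    stored = []
    for yt in y:
        xprev = x
        x = theta * xprev + rng.normal(0, np.sqrt(q), N)    # propagate
        stored.append((xprev, x))
        logw = log_norm(yt, x, r)
        w = np.exp(logw - logw.max()); w /= w.sum()
        x = x[rng.choice(N, N, p=w)]                        # resample
    return stored

def surrogate_loglik(stored, y, theta_run, theta, q, r):
    # Deterministic, smooth-in-theta approximation of log p(y | theta):
    # the particle locations stay fixed; only the weights are recomputed,
    # with an importance ratio f_theta / f_theta_run on the transitions.
    ll = 0.0
    for (xprev, x), yt in zip(stored, y):
        logw = (log_norm(yt, x, r)
                + log_norm(x, theta * xprev, q)
                - log_norm(x, theta_run * xprev, q))
        m = logw.max()
        ll += m + np.log(np.mean(np.exp(logw - m)))
    return ll

rng = np.random.default_rng(0)
q = r = 0.1
x, ys = 0.0, []
for _ in range(100):                        # simulate data at theta = 0.7
    x = 0.7 * x + rng.normal(0, np.sqrt(q))
    ys.append(x + rng.normal(0, np.sqrt(r)))
y = np.array(ys)

theta = 0.2
for _ in range(8):                          # iterate: filter, then optimize
    stored = run_pf(y, theta, 500, q, r, rng)
    grid = np.linspace(theta - 0.2, theta + 0.2, 81)  # local search window
    theta = grid[np.argmax([surrogate_loglik(stored, y, theta, g, q, r)
                            for g in grid])]
print(round(theta, 2))                      # should approach the true 0.7
```

The surrogate is only trustworthy near the parameter used to run the filter, which is why the search is restricted to a local window; this matches the abstract's description of the approximation as local.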

Place, publisher, year, edition, pages
Elsevier, 2018
Series
IFAC-PapersOnLine, ISSN 2405-8963 ; 51:15
National Category
Signal Processing; Control Engineering
Identifiers
urn:nbn:se:liu:diva-159807 (URN); 10.1016/j.ifacol.2018.09.216 (DOI); 000446599200111 (ISI)
Conference
SYSID 2018, July 9–11, Stockholm, Sweden
Funder
Swedish Foundation for Strategic Research, RT15-0012, ICA16-0015; Swedish Research Council, 621-2016-06079, 2016-04278
Available from: 2018-10-08 Created: 2019-08-22 Last updated: 2019-08-23
Svensson, A., Schön, T. B. & Lindsten, F. (2018). Learning of state-space models with highly informative observations: A tempered sequential Monte Carlo solution. Mechanical Systems and Signal Processing, 104, 915-928
Learning of state-space models with highly informative observations: A tempered sequential Monte Carlo solution
2018 (English). In: Mechanical Systems and Signal Processing, ISSN 0888-3270, E-ISSN 1096-1216, Vol. 104, p. 915-928. Article in journal (Refereed), Published
Abstract [en]

Probabilistic (or Bayesian) modeling and learning offers interesting possibilities for systematic representation of uncertainty using probability theory. However, probabilistic learning often leads to computationally challenging problems. Some problems of this type that were previously intractable can now be solved on standard personal computers thanks to recent advances in Monte Carlo methods. In particular, for learning of unknown parameters in nonlinear state-space models, methods based on the particle filter (a Monte Carlo method) have proven very useful. A notoriously challenging problem, however, still occurs when the observations in the state-space model are highly informative, i.e. when there is very little or no measurement noise present, relative to the amount of process noise. The particle filter will then struggle in estimating one of the basic components for probabilistic learning, namely the likelihood p(data | parameters). To this end we suggest an algorithm which initially assumes that there is a substantial amount of artificial measurement noise present. The variance of this noise is sequentially decreased in an adaptive fashion such that we, in the end, recover the original problem, or possibly a very close approximation of it. The main component in our algorithm is a sequential Monte Carlo (SMC) sampler, which gives our proposed method a clear resemblance to the SMC2 method. Another natural link is also made to the ideas underlying approximate Bayesian computation (ABC). We illustrate the method with numerical examples, and in particular show promising results for a challenging Wiener-Hammerstein benchmark problem.
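The tempering idea above can be sketched in miniature. The toy below is illustrative only: it uses a trivially simple model y[t] = theta + e[t] whose likelihood is available in closed form, so the SMC-sampler structure stands out (in the paper the likelihood must itself be estimated with a particle filter), and it uses a fixed geometric noise schedule where the paper's is adaptive. The prior, schedule length, and tuning constants are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
theta_true, r_true, T = 1.5, 1e-4, 50
y = theta_true + rng.normal(0, np.sqrt(r_true), T)  # nearly noise-free data

def loglik(theta, r):
    # Closed-form log-likelihood of y_t = theta + e_t, e_t ~ N(0, r),
    # vectorised over an array of parameter particles.
    return np.sum(-0.5 * np.log(2 * np.pi * r)
                  - 0.5 * (y - theta[:, None]) ** 2 / r, axis=1)

def logprior(theta):
    return -theta ** 2 / 8.0              # N(0, 4) prior, up to a constant

N = 1000
theta = rng.normal(0, 2, N)               # parameter particles from the prior
r_sched = np.geomspace(1.0, r_true, 30)   # artificial noise, decreasing
logw = loglik(theta, r_sched[0])          # initial weights at the largest r
for r_prev, r in zip(r_sched[:-1], r_sched[1:]):
    # resample, then rejuvenate with one random-walk Metropolis move
    # that leaves the current tempered posterior invariant
    w = np.exp(logw - logw.max()); w /= w.sum()
    theta = theta[rng.choice(N, N, p=w)]
    prop = theta + rng.normal(0, 0.1 * np.sqrt(r_prev), N)
    log_acc = (loglik(prop, r_prev) + logprior(prop)
               - loglik(theta, r_prev) - logprior(theta))
    theta = np.where(np.log(rng.uniform(size=N)) < log_acc, prop, theta)
    # incremental weight: next (sharper) target over the current one
    logw = loglik(theta, r) - loglik(theta, r_prev)
w = np.exp(logw - logw.max()); w /= w.sum()
theta = theta[rng.choice(N, N, p=w)]      # final resample
print(theta.mean())                       # close to theta_true
```

Because each level only sharpens the likelihood slightly, the incremental weights stay well-behaved even though a direct fit at r_true would give almost all particles zero weight, which is exactly the failure mode the abstract describes.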

Place, publisher, year, edition, pages
Academic Press Ltd / Elsevier Science Ltd, 2018
Keywords
Probabilistic modelling, Bayesian methods, Nonlinear system identification, Sequential Monte Carlo, Particle filter, Approximate Bayesian computations, Highly informative observations, Tempering, Wiener-Hammerstein model
National Category
Probability Theory and Statistics
Identifiers
urn:nbn:se:liu:diva-159808 (URN); 10.1016/j.ymssp.2017.09.016 (DOI); 000423652800057 (ISI)
Funder
Swedish Research Council, 621-2013-5524, 2016-04278, 621-2016-06079; Swedish Foundation for Strategic Research, RIT15-0012
Available from: 2018-05-15 Created: 2019-08-22 Last updated: 2019-08-23
Schön, T. B., Svensson, A., Murray, L. & Lindsten, F. (2018). Probabilistic learning of nonlinear dynamical systems using sequential Monte Carlo. Mechanical Systems and Signal Processing, 104, 866-883
Probabilistic learning of nonlinear dynamical systems using sequential Monte Carlo
2018 (English). In: Mechanical Systems and Signal Processing, ISSN 0888-3270, E-ISSN 1096-1216, Vol. 104, p. 866-883. Article in journal (Refereed), Published
Abstract [en]

Probabilistic modeling provides the capability to represent and manipulate uncertainty in data, models, predictions and decisions. We are concerned with the problem of learning probabilistic models of dynamical systems from measured data. Specifically, we consider learning of probabilistic nonlinear state-space models. There is no closed-form solution available for this problem, implying that we are forced to use approximations. In this tutorial we will provide a self-contained introduction to one of the state-of-the-art methods, the particle Metropolis-Hastings algorithm, which has proven to offer a practical approximation. This is a Monte Carlo-based method, where the particle filter is used to guide a Markov chain Monte Carlo method through the parameter space. One of the key merits of the particle Metropolis-Hastings algorithm is that it is guaranteed to converge to the "true solution" under mild assumptions, despite being based on a particle filter with only a finite number of particles. We will also provide a motivating numerical example illustrating the method using a modeling language tailored for sequential Monte Carlo methods. The intention of modeling languages of this kind is to open up the power of sophisticated Monte Carlo methods, including particle Metropolis-Hastings, to a large group of users without requiring them to know all the underlying mathematical details.
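The core mechanism described above, a particle filter guiding a Metropolis-Hastings chain, can be sketched compactly. The toy scalar model, the flat prior on (-1, 1), and the tuning constants below are illustrative assumptions, not taken from the paper; the essential point is that the particle filter's unbiased likelihood estimate is plugged directly into the acceptance ratio.

```python
import numpy as np

def pf_loglik(y, theta, N, q, r, rng):
    # Bootstrap particle filter: returns an estimate of log p(y | theta)
    # (the estimate is unbiased on the natural scale, not in log).
    x = np.zeros(N)
    ll = 0.0
    for yt in y:
        x = theta * x + rng.normal(0, np.sqrt(q), N)        # propagate
        logw = -0.5 * np.log(2 * np.pi * r) - 0.5 * (yt - x) ** 2 / r
        m = logw.max()
        ll += m + np.log(np.mean(np.exp(logw - m)))         # accumulate
        w = np.exp(logw - m); w /= w.sum()
        x = x[rng.choice(N, N, p=w)]                        # resample
    return ll

def pmh(y, n_iter, step, q, r, rng):
    # Particle Metropolis-Hastings: random-walk MH on theta in which the
    # intractable likelihood is replaced by the particle-filter estimate.
    theta = 0.0
    ll = pf_loglik(y, theta, 150, q, r, rng)
    chain = np.empty(n_iter)
    for i in range(n_iter):
        prop = theta + step * rng.normal()
        if abs(prop) < 1.0:                 # flat prior on (-1, 1), assumed
            ll_prop = pf_loglik(y, prop, 150, q, r, rng)
            if np.log(rng.uniform()) < ll_prop - ll:
                theta, ll = prop, ll_prop
        chain[i] = theta
    return chain

rng = np.random.default_rng(0)
q = r = 0.1
x, ys = 0.0, []
for _ in range(100):                        # simulate data at theta = 0.7
    x = 0.7 * x + rng.normal(0, np.sqrt(q))
    ys.append(x + rng.normal(0, np.sqrt(r)))
y = np.array(ys)
chain = pmh(y, 1200, 0.1, q, r, rng)
print(chain[600:].mean())                   # posterior mean near 0.7
```

This is the pseudo-marginal construction the abstract alludes to: replacing the exact likelihood with a finite-N unbiased estimate leaves the invariant distribution of the chain unchanged, only its mixing speed.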

Place, publisher, year, edition, pages
Academic Press Ltd / Elsevier Science Ltd, 2018
Keywords
Probabilistic modeling, Nonlinear dynamical systems, System identification, Parameter estimation, Bayesian methods, Metropolis-Hastings, Sequential Monte Carlo, Particle filter
National Category
Probability Theory and Statistics
Identifiers
urn:nbn:se:liu:diva-159809 (URN); 10.1016/j.ymssp.2017.10.033 (DOI); 000423652800054 (ISI)
Funder
Swedish Research Council, 621-2013-5524, 2016-04278, 621-2016-06079; Swedish Foundation for Strategic Research, RIT15-0012
Available from: 2018-05-14 Created: 2019-08-22 Last updated: 2019-08-23
Identifiers
ORCID iD: orcid.org/0000-0002-5601-1687
