Kaijser, Thomas
Publications (10 of 10)
Kaijser, T. (2017). A contraction theorem for Markov chains on general state spaces. Revue Roumaine De Mathématiques Pures Et Appliquées, 62(2), 355-370
A contraction theorem for Markov chains on general state spaces
2017 (English). In: Revue Roumaine De Mathématiques Pures Et Appliquées, ISSN 0035-3965, Vol. 62, no. 2, p. 355-370. Article in journal (Refereed). Published.
Abstract [en]

Let {X_n, n = 0, 1, 2, ...} denote a Markov chain on a general state space and let f be a nonnegative function. The purpose of this paper is to present conditions which will imply that f(X_n) tends to 0 a.s., as n tends to infinity. As an application we obtain a result on synchronisation for random dynamical systems. At the end of the paper, we also present a result on convergence in distribution for Markov chains on general state spaces, thereby generalising a similar result for Markov chains on compact metric spaces.
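
The paper's conditions are not restated in the abstract, but the synchronisation application can be illustrated by a toy sketch (the family of random affine contractions below is invented for illustration and is not taken from the paper): two trajectories are driven by the same sequence of random maps, X_n is the pair of current positions, and f(X_n) is the distance between them, which tends to 0.

import random

# Toy illustration of synchronisation (invented example, not from the paper):
# two trajectories on [0, 1] are driven by the SAME sequence of random
# contractions.  With X_n = (x_n, y_n) and f(x, y) = |x - y|, the quantity
# f(X_n) tends to 0, i.e. the trajectories synchronise whatever their starting points.

def random_contraction():
    # an invented family of contractions x -> a*x + b mapping [0, 1] into itself
    a = random.uniform(0.2, 0.8)
    b = random.uniform(0.0, 0.2)
    return lambda x: a * x + b

x, y = 0.0, 1.0                      # two different initial points
for n in range(1, 51):
    g = random_contraction()         # one random map per step, applied to both
    x, y = g(x), g(y)
    if n % 10 == 0:
        print(f"n = {n:2d}   f(X_n) = |x - y| = {abs(x - y):.2e}")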

AMS 2010 Subject Classification: 60J05, 60F15, 60F05.

Place, publisher, year, edition, pages
Bucharest, Romania: Editura Academiei Romane / Publishing House of the Romanian Academy, 2017
Keywords
functions of Markov chains, synchronisation, convergence in distribution, random dynamical systems, iterated function systems
National Category
Mathematics
Identifiers
urn:nbn:se:liu:diva-142821 (URN); 000412410500003 ()
Available from: 2017-11-06 Created: 2017-11-06 Last updated: 2019-01-24. Bibliographically approved
Kaijser, T. (2017). A Note on the Rechargeable Polya Urn Scheme. Linköping: Linköping University Electronic Press
A Note on the Rechargeable Polya Urn Scheme
2017 (English). Report (Other academic).
Abstract [en]

A very simple special case of a Polya urn scheme is as follows. At each trial one draws a ball from an urn containing balls of two different colours, notes its colour, and returns it to the urn together with another ball of the same colour; then one makes another draw, and so on. At the first draw there is one ball of each colour.

The rechargeable Polya urn scheme is essentially the same except that between each draw there is a fixed probability that the process starts over with two balls in the urn having different colours.

Now, for n = 1, 2, ..., let B(n) and G(n) denote, respectively, the number of blue and yellow balls in the urn, and let Y(n) denote the colour of the ball drawn at the nth draw. Further, let Z(n) denote the probability distribution of (B(n), G(n)) given that we have observed Y(m) for m = 1 to m = n. In this note we prove that the sequence Z(1), Z(2), ... converges in distribution.
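
A minimal simulation of the scheme as described above may make the setup concrete; the recharge probability used here is an arbitrary illustrative value, not a quantity taken from the note.

import random

def rechargeable_polya(n_draws, recharge_prob=0.05, seed=0):
    # Simulate the rechargeable Polya urn sketched above.  The urn starts with
    # one ball of each colour; after each draw the drawn ball is returned with
    # one more ball of the same colour.  Between draws, with probability
    # recharge_prob (an arbitrary illustrative value), the process starts over
    # with one ball of each colour.
    rng = random.Random(seed)
    blue, yellow = 1, 1
    observations = []
    for _ in range(n_draws):
        if rng.random() < recharge_prob:              # recharge: start over
            blue, yellow = 1, 1
        if rng.random() < blue / (blue + yellow):     # draw a ball
            blue += 1
            observations.append("blue")
        else:
            yellow += 1
            observations.append("yellow")
    return observations, (blue, yellow)

Y, composition = rechargeable_polya(1000)
print("final urn composition (blue, yellow):", composition)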

Place, publisher, year, edition, pages
Linköping: Linköping University Electronic Press, 2017. p. 21
Series
LiTH-MAT-R, ISSN 0348-2960 ; 13
Keywords
Polya urn scheme, Hidden Markov Models, random systems with complete connections, asymptotic stability
National Category
Probability Theory and Statistics
Identifiers
urn:nbn:se:liu:diva-141831 (URN); LiTH-MAT-R--2017/13--SE (ISRN)
Available from: 2017-10-09 Created: 2017-10-09 Last updated: 2018-03-16. Bibliographically approved
Kaijser, T. (2016). On convergence in distribution of the Markov chain generated by the filter kernel induced by a fully dominated Hidden Markov Model. Dissertationes Mathematicae, 514
On convergence in distribution of the Markov chain generated by the filter kernel induced by a fully dominated Hidden Markov Model
2016 (English). In: Dissertationes Mathematicae, ISSN 0012-3862, E-ISSN 1730-6310, Vol. 514. Article in journal (Refereed). Published.
Abstract [en]

Consider a Hidden Markov Model (HMM) such that both the state space and the observation space are complete, separable, metric spaces and for which both the transition probability function (tr.pr.f.) determining the hidden Markov chain of the HMM and the tr.pr.f. determining the observation sequence of the HMM have densities. Such HMMs are called fully dominated. In this paper we consider a subclass of fully dominated HMMs which we call regular. A fully dominated, regular HMM induces a tr.pr.f. on the set of probability density functions on the state space, which we call the filter kernel induced by the HMM and which can be interpreted as the Markov kernel associated to the sequence of conditional state distributions. We show that if the underlying hidden Markov chain of the fully dominated, regular HMM is strongly ergodic and a certain coupling condition is fulfilled, then, in the limit, the distribution of the conditional distribution becomes independent of the initial distribution of the hidden Markov chain; if, moreover, the hidden Markov chain is uniformly ergodic, then the distributions tend towards a limit distribution. In the last part of the paper, we present some more explicit conditions which imply that the coupling condition mentioned above is satisfied.
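
The filter kernel itself is defined abstractly in the paper; the sketch below shows only the standard one-step filter recursion for a density-dominated HMM, discretised on a grid purely for illustration (the transition and observation densities are made-up Gaussians, not taken from the paper).

import numpy as np

# One step of the filter recursion for a fully dominated HMM: given the current
# conditional state density p_n, a transition density q(x, x') and a new
# observation y with observation density r(x', y),
#     p_{n+1}(x')  is proportional to  r(x', y) * integral p_n(x) q(x, x') dx.
# Everything below is discretised on a grid of [0, 1] purely for illustration.

grid = np.linspace(0.0, 1.0, 200)
dx = grid[1] - grid[0]

def q(x, xp):      # illustrative (unnormalised) Gaussian transition density
    return np.exp(-0.5 * ((xp - 0.9 * x) / 0.1) ** 2)

def r(xp, y):      # illustrative (unnormalised) Gaussian observation density
    return np.exp(-0.5 * ((y - xp) / 0.2) ** 2)

def filter_step(p, y):
    # apply the filter kernel to the density p, given a new observation y
    predicted = np.array([np.sum(p * q(grid, xp)) * dx for xp in grid])
    updated = predicted * r(grid, y)
    return updated / (np.sum(updated) * dx)          # renormalise to a density

p = np.ones_like(grid)                               # uniform initial density
for y in [0.3, 0.5, 0.7]:                            # a few made-up observations
    p = filter_step(p, y)
print("posterior mean after three observations:", round(float(np.sum(grid * p) * dx), 3))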

Place, publisher, year, edition, pages
Institute of Mathematics, Polish Academy of Sciences (IMPAN), 2016
Keywords
hidden Markov models; filtering processes; Markov chains on nonlocally compact spaces; Kantorovich distance; barycenter
National Category
Mathematical Analysis
Identifiers
urn:nbn:se:liu:diva-129178 (URN); 10.4064/dm739-9-2015 (DOI); 000376340100001 ()
Available from: 2016-06-13 Created: 2016-06-13 Last updated: 2017-11-28
Kaijser, T. (2016). Stochastic Perturbations of Iterations of a Simple, Non-expanding, Nonperiodic, Piecewise Linear, Interval-map. Linköping: Linköping University Electronic Press
Stochastic Perturbations of Iterations of a Simple, Non-expanding, Nonperiodic, Piecewise Linear, Interval-map
2016 (English). Report (Other academic).
Abstract [en]

Let g(x) = x/2 + 17/30 (mod 1), let ξ_i, i = 1, 2, ..., be a sequence of independent, identically distributed random variables with uniform distribution on the interval [0, 1/15], define g_i(x) = g(x) + ξ_i (mod 1) and, for n = 1, 2, ..., define g^n(x) = g_n(g_{n-1}(...(g_1(x))...)). For x ∈ [0, 1), let μ_{n,x} denote the distribution of g^n(x). The purpose of this note is to show that there exists a unique probability measure μ such that, for all x ∈ [0, 1), μ_{n,x} tends to μ as n → ∞. This contradicts a claim by Lasota and Mackey from 1987 stating that the process has an asymptotic three-periodicity.
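
A small simulation of the perturbed map described above (illustration only; the map and noise parameters are those stated in the abstract) lets one inspect the empirical distribution of g^n(x) for different starting points; the note proves that these distributions converge to one and the same limit measure.

import numpy as np

rng = np.random.default_rng(0)

def perturbed_orbit(x0, n_steps):
    # iterate g_i(x) = x/2 + 17/30 + xi_i (mod 1), with xi_i uniform on [0, 1/15]
    x = x0
    for _ in range(n_steps):
        xi = rng.uniform(0.0, 1.0 / 15.0)
        x = (x / 2.0 + 17.0 / 30.0 + xi) % 1.0
    return x

# crude empirical check: histograms of g^200(x) over many independent runs,
# started from two different points, should look approximately the same
samples_a = [perturbed_orbit(0.0, 200) for _ in range(2000)]
samples_b = [perturbed_orbit(0.9, 200) for _ in range(2000)]
hist_a, _ = np.histogram(samples_a, bins=10, range=(0.0, 1.0), density=True)
hist_b, _ = np.histogram(samples_b, bins=10, range=(0.0, 1.0), density=True)
print(np.round(hist_a, 2))
print(np.round(hist_b, 2))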

Place, publisher, year, edition, pages
Linköping: Linköping University Electronic Press, 2016. p. 18
Series
LiTH-MAT-R, ISSN 0348-2960 ; 2016:05
Keywords
convergence of distributions, random dynamical systems, stochastic perturbations of iterations, nonexpanding interval maps
National Category
Mathematics
Identifiers
urn:nbn:se:liu:diva-128167 (URN); LiTH-MAT-R--2016/05--SE (ISRN)
Available from: 2016-05-19 Created: 2016-05-19 Last updated: 2016-09-28. Bibliographically approved
Kaijser, T. (2015). A Contraction Theorem for Markov Chains on General State Spaces. Linköping University Electronic Press
A Contraction Theorem for Markov Chains on General State Spaces
2015 (English). Report (Other academic).
Abstract [en]

Let {X(n), n = 0, 1, 2, ...} denote a Markov chain on a general state space and let f be a nonnegative function. The purpose of this paper is to present conditions which will imply that f(X(n)) tends to 0 a.s., as n tends to infinity. As an application we obtain a result on synchronisation for random dynamical systems. At the end of the paper we also present a result on convergence in distribution for random dynamical systems on complete, separable, metric spaces, a result which generalises a similar result for random dynamical systems on compact metric spaces.

Place, publisher, year, edition, pages
Linköping University Electronic Press, 2015. p. 17
Series
LiTH-MAT-R, ISSN 0348-2960 ; 2015:11
Keywords
functions of Markov chains, synchronisation, convergence in distribution, random dynamical systems
National Category
Mathematics
Identifiers
urn:nbn:se:liu:diva-121465 (URN); LiTH-MAT-R--2015/11--SE (ISRN)
Available from: 2015-09-22 Created: 2015-09-21 Last updated: 2016-11-24. Bibliographically approved
Kaijser, T. (2013). Convergence in distribution for filtering processes associated to Hidden Markov Models with densities. Linköping: Linköping University Electronic Press
Convergence in distribution for filtering processes associated to Hidden Markov Models with densities
2013 (English). Report (Other academic).
Abstract [en]

A Hidden Markov Model generates two basic stochastic processes, a Markov chain, which is hidden, and an observation sequence. The filtering process of a Hidden Markov Model is, roughly speaking, the sequence of conditional distributions of the hidden Markov chain that is obtained as new observations are received.

It is well known that the filtering process itself is also a Markov chain. A classical theoretical problem is to find conditions which imply that the distributions of the filtering process converge towards a unique limit measure.

This problem goes back to a paper of D. Blackwell for the case when the Markov chain takes its values in a finite set, and to a paper of H. Kunita for the case when the state space of the Markov chain is a compact Hausdorff space.

Recently, due to work by F. Kochmann, J. Reeds, P. Chigansky and R. van Handel, a necessary and sufficient condition for the convergence of the distributions of the filtering process has been found for the case when the state space is finite. This condition has since been generalised to the case when the state space is denumerable.

In this paper we generalise some of the previous results on convergence in distribution to the case when the Markov chain and the observation sequence of a Hidden Markov Model take their values in complete, separable, metric spaces, although it has been necessary to assume that both the transition probability function of the Markov chain and the transition probability function that generates the observation sequence have densities.

Place, publisher, year, edition, pages
Linköping: Linköping University Electronic Press, 2013. p. 79
Series
LiTH-MAT-R, ISSN 0348-2960 ; 2013:05
Keywords
Hidden Markov Models, filtering processes, Markov chains on nonlocally compact spaces, convergence in distribution, barycenter, ergodicity, Kantorovich metric.
National Category
Mathematics
Identifiers
urn:nbn:se:liu:diva-92590 (URN); LiTH-MAT-R--2013/05--SE (ISRN)
Available from: 2013-05-14 Created: 2013-05-14 Last updated: 2013-05-14. Bibliographically approved
Kaijser, T. (2011). On Markov Chains Induced by Partitioned Transition Probability Matrices. Acta Mathematica Sinica, English Series, 27(3), 441-476
On Markov Chains Induced by Partitioned Transition Probability Matrices
2011 (English). In: Acta Mathematica Sinica, English Series, ISSN 1439-8516, Vol. 27, no. 3, p. 441-476. Article in journal (Refereed). Published.
Abstract [en]

Let S be a denumerable state space and let P be a transition probability matrix on S. If a denumerable set M of nonnegative matrices is such that the sum of the matrices is equal to P, then we call M a partition of P. Let K denote the set of probability vectors on S. With every partition M of P we can associate a transition probability function P_M on K defined in such a way that if p ∈ K and M ∈ M are such that ‖pM‖ > 0, then, with probability ‖pM‖, the vector p is transferred to the vector pM/‖pM‖. Here ‖·‖ denotes the l_1-norm. In this paper we investigate the convergence in distribution for Markov chains generated by transition probability functions induced by partitions of transition probability matrices. The main motivation for this investigation is the application of the convergence results obtained to filtering processes of partially observed Markov chains with denumerable state space.
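
A small numerical sketch of this construction (the matrices below are made-up numbers chosen only to satisfy the definition) shows one way the induced Markov chain on probability vectors can be simulated.

import numpy as np

# A partition M of the transition matrix P is a set of nonnegative matrices
# summing to P.  Given a probability vector p, a matrix M in the partition is
# chosen with probability ||pM||_1, and p is transferred to pM / ||pM||_1;
# iterating this random transfer generates the Markov chain on probability
# vectors studied in the paper.

rng = np.random.default_rng(0)

P = np.array([[0.5, 0.5],
              [0.3, 0.7]])          # illustrative transition matrix on S = {0, 1}
M0 = np.array([[0.4, 0.1],
               [0.1, 0.2]])         # one nonnegative piece of P
M1 = P - M0                         # the remaining piece; M0 + M1 = P
partition = [M0, M1]

def one_step(p):
    # one random transfer p -> pM/||pM||, with M chosen with probability ||pM||
    weights = np.array([np.sum(p @ M) for M in partition])   # ||pM||_1 for each M
    k = rng.choice(len(partition), p=weights)                 # weights sum to 1
    return (p @ partition[k]) / weights[k]

p = np.array([1.0, 0.0])
for n in range(5):
    p = one_step(p)
    print(f"step {n + 1}: p = {np.round(p, 3)}")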

Place, publisher, year, edition, pages
Springer Science Business Media, 2011
Keywords
Markov chains on nonlocally compact spaces, filtering processes, hidden Markov chains, Kantorovich metric, barycenter
National Category
Mathematics
Identifiers
urn:nbn:se:liu:diva-66907 (URN); 10.1007/s10114-010-9696-9 (DOI); 000287769600003 ()
Available from: 2011-03-21 Created: 2011-03-21 Last updated: 2011-04-06
Kaijser, T. (2009). On Markov chains induced by partitioned transition probability matrices. Linköping: Linköping University Electronic Press
On Markov chains induced by partitioned transition probability matrices
2009 (English). Report (Other academic).
Abstract [en]

Let S be a denumerable state space and let P be a transition probability matrix on S. If a denumerable set M of nonnegative matrices is such that the sum of the matrices is equal to P, then we call M a partition of P.

Let K denote the set of probability vectors on S. With every partition M of P we can associate a transition probability function P_M on K defined in such a way that if p ∈ K and M ∈ M are such that ‖pM‖ > 0, then, with probability ‖pM‖, the vector p is transferred to the vector pM/‖pM‖. Here ‖·‖ denotes the l_1-norm.

In this paper we investigate the convergence in distribution for Markov chains generated by transition probability functions induced by partitions of transition probability matrices. The main motivation for this investigation is the application of the convergence results obtained to filtering processes of partially observed Markov chains with denumerable state space.

Place, publisher, year, edition, pages
Linköping: Linköping University Electronic Press, 2009
Series
LiTH-MAT-R, ISSN 0348-2960 ; 2009:07
National Category
Mathematics
Identifiers
urn:nbn:se:liu:diva-19841 (URN); LiTH-MAT-R-2009/07--SE (ISRN)
Available from: 2009-08-12 Created: 2009-08-12 Last updated: 2016-07-08. Bibliographically approved
Kaijser, T. (2008). On the convergence in distribution of the sequence of conditional state distributions induced by a Hidden Markov chain with denumerable state space. Linköping: Linköpings universitet
On the convergence in distribution of the sequence of conditional state distributions induced by a Hidden Markov chain with denumerable state space
2008 (English). Report (Other academic).

Place, publisher, year, edition, pages
Linköping: Linköpings universitet, 2008
Series
LiTH-MAT-R ; 3
Keywords
Applied mathematics
National Category
Mathematics
Identifiers
urn:nbn:se:liu:diva-42461 (URN); LiTH-MAT-R-2008-03 (ISRN); 64707 (Local ID); 64707 (Archive number); 64707 (OAI)
Available from: 2009-10-10 Created: 2009-10-10
Kaijser, T. (1978). A Note on a Theorem by Larry Shepp. Linköping University Electronic Press
A Note on a Theorem by Larry Shepp
1978 (English). Report (Other academic).
Place, publisher, year, edition, pages
Linköping University Electronic Press, 1978. p. 4
Series
LiTH-MAT-R, ISSN 0348-2960 ; 1978:18
National Category
Mathematics
Identifiers
urn:nbn:se:liu:diva-112263 (URN); LiTH-MAT-R-78-18 (ISRN)
Available from: 2014-11-20 Created: 2014-11-20 Last updated: 2014-11-20