
Question
Problem 2: (mean return time) Consider a Markov chain {X_n} on states {0, 1, 2, 3} with transition matrix

        | 0    1    0    0   |
    P = | 0.1  0.4  0.2  0.3 |
        | 0.2  0.2  0.5  0.1 |
        | 0.3  0.3  0.4  0   |

1. Compute the limiting distribution (π_0, π_1, π_2, π_3) of this Markov chain.
2. For each state i, compute (directly) m_ii, the average number of steps it takes to return to i when started in i, and verify that the relation m_ii = 1/π_i holds.
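A minimal numerical sketch of both parts is below, not the hand computation the problem asks for. It assumes the transition matrix as reconstructed above (the zero entries in rows 0 and 3 are inferred from the row-sum-to-1 constraint, since they were not legible in the source image). It solves πP = π with the normalization Σπ_i = 1 for part 1, uses first-step analysis for the mean return times m_ii in part 2, and then checks m_ii = 1/π_i. Only NumPy is used.

```python
import numpy as np

# Transition matrix as reconstructed above; the zeros are assumptions
# based on each row summing to 1.
P = np.array([
    [0.0, 1.0, 0.0, 0.0],
    [0.1, 0.4, 0.2, 0.3],
    [0.2, 0.2, 0.5, 0.1],
    [0.3, 0.3, 0.4, 0.0],
])
n = P.shape[0]

# Part 1: stationary (limiting) distribution pi, solving pi P = pi together
# with the normalization sum(pi) = 1 as a least-squares system.
A = np.vstack([P.T - np.eye(n), np.ones(n)])
b = np.concatenate([np.zeros(n), [1.0]])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

# Part 2: mean return times via first-step analysis.
# Fix a state i and let k_j = E[steps to reach i | start at j] for j != i.
# Then k_j = 1 + sum_{l != i} P[j, l] * k_l, i.e. (I - Q) k = 1 where Q is P
# restricted to the states other than i, and m_ii = 1 + sum_l P[i, l] * k_l.
m = np.zeros(n)
for i in range(n):
    others = [j for j in range(n) if j != i]
    Q = P[np.ix_(others, others)]                      # transitions among states != i
    k = np.linalg.solve(np.eye(n - 1) - Q, np.ones(n - 1))
    m[i] = 1.0 + P[i, others] @ k

print("pi       =", np.round(pi, 4))
print("m_ii     =", np.round(m, 4))
print("1 / pi_i =", np.round(1.0 / pi, 4))             # should match m_ii
```

Running the script prints π, the directly computed m_ii, and 1/π_i side by side, so the claimed identity m_ii = 1/π_i can be checked entry by entry.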