Question

Suppose that a Markov chain with 3 states and with transition matrix P is in state 2 on
the second observation. Which of the following expressions represents the probability
that it will be in state 3 on the third observation?

(A) the (2, 3) entry of P^3    (B) the (2, 3) entry of P^2
(C) the (3, 3) entry of P^2    (D) the (2, 2) entry of P^3
(E) the (2, 3) entry of P      (F) the (3, 2) entry of P
(G) the (3, 2) entry of P^3    (H) the (3, 2) entry of P^2
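Only one observation (one step of the chain) elapses between the second and third observations, so the quantity asked for is a single-step transition probability from state 2 to state 3. The NumPy sketch below illustrates that reasoning; the particular entries of P are hypothetical, since the problem does not specify them.

```python
import numpy as np

# Hypothetical 3x3 transition matrix P (rows sum to 1); the actual P in the
# problem is not given, so these values are illustrative only.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.4, 0.4, 0.2],
])

# The chain is in state 2 on the second observation: represent that as a
# row vector with all probability mass on state 2 (index 1 in 0-based terms).
x2 = np.array([0.0, 1.0, 0.0])

# One more observation means one application of P, so the distribution on
# the third observation is x2 @ P.
x3 = x2 @ P

# The probability of being in state 3 is the third component of x3, which
# equals the (2, 3) entry of P (row 2, column 3 in 1-based indexing).
print(x3[2])    # 0.3
print(P[1, 2])  # 0.3 -- the same value
```

Running the sketch shows that the state-3 probability after one step coincides with the (2, 3) entry of P itself, not of a power of P; powers of P would only be needed if more than one observation separated the two time points.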
