
Question
Conditional entropy. Let A and B = (B₀, B₁, ..., Bₙ) be a discrete random variable and a discrete random vector, respectively. The conditional entropy of A with respect to B is defined as

H(A | B) = E( E{ −log f(A | B) | B } ),

where f(a | b) = P(A = a | B = b). Let X be an aperiodic Markov chain on a finite state space, with transition probabilities pᵢⱼ. Show that

H(Xₙ₊₁ | X₀, X₁, ..., Xₙ) = H(Xₙ₊₁ | Xₙ),

and that

H(Xₙ₊₁ | Xₙ) → −Σᵢ πᵢ Σⱼ pᵢⱼ log pᵢⱼ   as n → ∞,

if X is aperiodic with a unique stationary distribution π.
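The first identity follows from the Markov property applied inside the definition above; a sketch of the key step (my own derivation, using the problem's notation):

```latex
% The Markov property says the conditional mass function of X_{n+1}
% given the whole past depends only on X_n:
f(x_{n+1} \mid x_0, \dots, x_n)
  = P(X_{n+1} = x_{n+1} \mid X_0 = x_0, \dots, X_n = x_n)
  = P(X_{n+1} = x_{n+1} \mid X_n = x_n)
  = f(x_{n+1} \mid x_n).
% Substituting into H(A \mid B) = E\{-\log f(A \mid B)\} with
% A = X_{n+1} and B = (X_0, \dots, X_n) versus B = X_n gives
H(X_{n+1} \mid X_0, X_1, \dots, X_n) = H(X_{n+1} \mid X_n).
```

For the limit, note that H(Xₙ₊₁ | Xₙ) = Σᵢ P(Xₙ = i) · ( −Σⱼ pᵢⱼ log pᵢⱼ ), and P(Xₙ = i) → πᵢ by the convergence theorem for aperiodic chains with a unique stationary distribution.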
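A quick numerical illustration of the claimed limit: for an aperiodic finite chain, H(Xₙ₊₁ | Xₙ) = Σᵢ P(Xₙ = i) H(row i of P) should approach −Σᵢ πᵢ Σⱼ pᵢⱼ log pᵢⱼ. This is a minimal sketch; the 3-state transition matrix and the starting state are illustrative assumptions, not part of the problem.

```python
import numpy as np

# Hypothetical aperiodic, irreducible 3-state transition matrix (rows sum to 1).
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])

def row_entropy(p):
    # Entropy -sum p log p of a probability vector, with the convention 0 log 0 = 0.
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# Distribution of X_n after 50 steps, starting from state 0: mu_n = mu_0 P^n.
mu = np.array([1.0, 0.0, 0.0])
for _ in range(50):
    mu = mu @ P

# H(X_{n+1} | X_n) = sum_i P(X_n = i) * H(i-th row of P).
h_cond = sum(mu[i] * row_entropy(P[i]) for i in range(3))

# Stationary distribution pi: left eigenvector of P for eigenvalue 1, normalized.
evals, evecs = np.linalg.eig(P.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi = pi / pi.sum()

# The claimed limit -sum_i pi_i sum_j p_ij log p_ij.
limit = sum(pi[i] * row_entropy(P[i]) for i in range(3))

print(h_cond, limit)
```

By n = 50 the distribution of Xₙ is numerically indistinguishable from π, so the two printed values agree, matching the stated convergence.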