
Friday, January 10, 2014

 

Proposition 2.7 Markov Chain

For random variables \(X_1, X_2, \cdots, X_n\), where \(n \ge 3\), \(X_1 \rightarrow X_2 \rightarrow \cdots \rightarrow X_n\) forms a Markov Chain if
  \[ p(x_1, x_2, \cdots, x_n) = \begin{cases} p(x_1,x_2)\,p(x_3|\,x_2)\cdots p(x_n|\,x_{n-1}) & \text{if } p(x_2), p(x_3), \cdots, p(x_{n-1}) > 0 \\ 0 & \text{otherwise}. \end{cases} \]
Show that \(X_1 \rightarrow X_2 \rightarrow \cdots \rightarrow X_n\) forms a Markov Chain if and only if \(X_n \rightarrow X_{n-1} \rightarrow \cdots \rightarrow X_1\) forms a Markov Chain.
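
As a concrete illustration (an added example, not part of the original statement), the smallest case \(n = 3\) reads: \(X_1 \rightarrow X_2 \rightarrow X_3\) forms a Markov Chain if
  \[ p(x_1, x_2, x_3) = \begin{cases} p(x_1,x_2)\,p(x_3|\,x_2) & \text{if } p(x_2) > 0 \\ 0 & \text{otherwise}, \end{cases} \]
which is the familiar statement that \(X_1\) and \(X_3\) are conditionally independent given \(X_2\).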

By definition, for \(p(b) > 0\),
  \[ p(a |\, b) = {p(a, b) \over p(b) } \]
Furthermore, if \(X_1 \rightarrow X_2 \rightarrow \cdots \rightarrow X_n\) forms a Markov Chain and \(p(x_2), p(x_3), \cdots, p(x_{n-1}) > 0\),
  \[ \begin{aligned} p(x_1, x_2, \cdots, x_n) &= p(x_1,x_2)p(x_3|\,x_2)\cdots p(x_{n-1}|\,x_{n-2}) p(x_n|\,x_{n-1}) \\ &= p(x_1,x_2){p(x_3, x_2) \over p(x_2)} \cdots {p(x_{n-1},x_{n-2}) \over p(x_{n-2})} {p(x_n,x_{n-1}) \over p(x_{n-1})} \\ &= {p(x_1,x_2) \over p(x_2)} {p(x_3, x_2) \over p(x_3)} \cdots {p(x_{n-2},x_{n-1}) \over p(x_{n-1})} p(x_n,x_{n-1}) \\ &= p(x_n,x_{n-1})p(x_{n-2}|\,x_{n-1})\cdots p(x_2|\,x_3) p(x_1|\,x_2) \\ \end{aligned} \]
If instead some \(p(x_i) = 0\) for \(2 \le i \le n-1\), then \(p(x_1, x_2, \cdots, x_n) = 0\) by the definition of the forward chain, which is exactly what the definition of the reverse chain requires in that case. \(\therefore X_n \rightarrow X_{n-1} \rightarrow \cdots \rightarrow X_1\) forms a Markov Chain. The converse follows by the same argument with the roles of \(X_1, \cdots, X_n\) and \(X_n, \cdots, X_1\) interchanged.

\(\Box\)
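
As a quick numerical sanity check (a sketch added here, not part of the original post), the snippet below builds a random four-step chain \(X_1 \rightarrow X_2 \rightarrow X_3 \rightarrow X_4\) over a small alphabet with NumPy and confirms that the forward and reverse factorizations produce the same joint distribution; the alphabet size, seed, and array names are arbitrary choices.

import numpy as np

rng = np.random.default_rng(0)
K = 3  # alphabet size (arbitrary)

def random_dist(shape):
    # Strictly positive distribution, normalized over the last axis.
    p = rng.random(shape) + 0.1
    return p / p.sum(axis=-1, keepdims=True)

p1  = random_dist(K)         # p(x1)
T12 = random_dist((K, K))    # p(x2 | x1)
T23 = random_dist((K, K))    # p(x3 | x2)
T34 = random_dist((K, K))    # p(x4 | x3)

# Forward factorization: p(x1,x2,x3,x4) = p(x1,x2) p(x3|x2) p(x4|x3)
joint = np.einsum('a,ab,bc,cd->abcd', p1, T12, T23, T34)

# Marginals needed for the reverse factorization
p12 = joint.sum(axis=(2, 3))   # p(x1, x2)
p23 = joint.sum(axis=(0, 3))   # p(x2, x3)
p34 = joint.sum(axis=(0, 1))   # p(x3, x4)
p2  = p12.sum(axis=0)          # p(x2)
p3  = p23.sum(axis=0)          # p(x3)

# Reverse factorization: p(x4,x3) p(x2|x3) p(x1|x2)
rev = np.einsum('cd,bc,ab->abcd', p34, p23 / p3, p12 / p2)

print(np.abs(joint - rev).max())   # ~1e-16: the two factorizations agree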

Source: Information Theory.

