Saturday, January 11, 2014
Markov subchains
A subchain of a Markov chain is also a Markov chain. How can we prove this? Here is the idea. Suppose the chain passes through \(X_i\) and \(X_n\), and we marginalize out the variables between them (shown in blue):
\[ \begin{aligned} \cdots \rightarrow X_i \rightarrow \color{blue}{X_j \rightarrow \cdots} \rightarrow X_n \rightarrow \cdots \\ \end{aligned} \]

\[ \begin{aligned} p(x_1, \cdots, x_i, \color{blue}{x_j, \cdots,} x_n, \cdots) &= p(x_1,x_2) p(x_3|x_2) \cdots \color{blue}{p(x_j|x_i) \cdots p(x_n|x_{n-1})} \cdots \\ &= p(x_1,x_2) {p(x_2,x_3) \over p(x_2)} \cdots {\color{blue}{p(x_i,x_j)} \over p(x_i)} \color{blue}{\cdots {p(x_{n-1},x_n) \over p(x_{n-1})}} \cdots \\ \end{aligned} \]
Summing both sides over the blue variables \(x_j, \cdots, x_{n-1}\) collapses the blue factors into \({p(x_i,x_n) \over p(x_i)} = p(x_n|x_i)\) (teal):

\[ \begin{aligned} p(x_1, \cdots, \color{teal}{x_i, x_n}, \cdots) &= p(x_1,x_2) {p(x_2,x_3) \over p(x_2)} \cdots {\color{teal}{p(x_i, x_n)} \over p(x_i)} \cdots \\ \end{aligned} \]
This is exactly the factorization of a Markov chain on the remaining variables:

\[ \begin{aligned} \cdots \rightarrow X_i \rightarrow X_n \rightarrow \cdots \\ \end{aligned} \]
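The marginalization argument above can be sanity-checked numerically. The sketch below is not from the source; the chain length, state count, and the choice of which variable to drop are arbitrary. It builds a 4-step chain \(X_1 \rightarrow X_2 \rightarrow X_3 \rightarrow X_4\), marginalizes out \(X_2\), and verifies that \(X_1 \rightarrow X_3 \rightarrow X_4\) satisfies the Markov factorization \(p(x_1,x_3,x_4)\,p(x_3) = p(x_1,x_3)\,p(x_3,x_4)\):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_dist(shape):
    """Random nonnegative array normalized so the last axis sums to 1."""
    m = rng.random(shape)
    return m / m.sum(axis=-1, keepdims=True)

# Hypothetical 4-step chain X1 -> X2 -> X3 -> X4 over 3 states each.
p1 = random_dist(3)                          # p(x1)
T12, T23, T34 = (random_dist((3, 3)) for _ in range(3))

# Joint distribution from the Markov factorization
# p(x1,x2,x3,x4) = p(x1) p(x2|x1) p(x3|x2) p(x4|x3).
joint = np.einsum('a,ab,bc,cd->abcd', p1, T12, T23, T34)

# Marginalize out X2, keeping p(x1, x3, x4).
sub = joint.sum(axis=1)

# X1 -> X3 -> X4 is Markov iff
# p(x1,x3,x4) * p(x3) == p(x1,x3) * p(x3,x4) for every value.
p13 = sub.sum(axis=2)
p34 = sub.sum(axis=0)
p3 = p34.sum(axis=1)
lhs = sub * p3[None, :, None]
rhs = p13[:, :, None] * p34[None, :, :]
print(np.allclose(lhs, rhs))                 # True: the subchain is Markov
```

The identity checked is just the conditional independence \(X_4 \perp X_1 \mid X_3\) written without divisions, so it holds at every point including zero-probability states.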
Formally, let \(\mathcal{N}_n = \{1,2,\cdots,n\}\) and let \(X_1 \rightarrow X_2 \rightarrow \cdots \rightarrow X_n\) form a Markov chain. For any subset \(\alpha\) of \(\mathcal{N}_n\), denote \((X_i, i\in \alpha)\) by \(X_\alpha\). Then for any disjoint subsets \(\alpha_1, \alpha_2, \cdots, \alpha_m\) of \(\mathcal{N}_n\) such that

\[ \begin{aligned} k_1 < k_2 < \cdots < k_m \end{aligned} \]

for all \(k_j \in \alpha_j\), \(j = 1, 2, \cdots, m\),

\[ \begin{aligned} X_{\alpha_1} \rightarrow X_{\alpha_2} \rightarrow \cdots \rightarrow X_{\alpha_m} \end{aligned} \]

forms a Markov chain.
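To make the formal statement concrete, here is a numerical check of one instance (my own example, not from the source): take \(n = 6\), \(\alpha_1 = \{1,2\}\), \(\alpha_2 = \{3,4\}\), \(\alpha_3 = \{6\}\). Every index in \(\alpha_1\) is below every index in \(\alpha_2\), which is below the index in \(\alpha_3\), so \((X_1,X_2) \rightarrow (X_3,X_4) \rightarrow X_6\) should form a Markov chain:

```python
import numpy as np

rng = np.random.default_rng(1)

def random_dist(shape):
    """Random nonnegative array normalized so the last axis sums to 1."""
    m = rng.random(shape)
    return m / m.sum(axis=-1, keepdims=True)

# Hypothetical 6-step binary chain X1 -> X2 -> ... -> X6.
p1 = random_dist(2)
Ts = [random_dist((2, 2)) for _ in range(5)]
joint = np.einsum('a,ab,bc,cd,de,ef->abcdef', p1, *Ts)

# Blocks alpha1 = {1,2}, alpha2 = {3,4}, alpha3 = {6}; X5 is dropped.
sub = joint.sum(axis=4)                      # p(x1, x2, x3, x4, x6)

# (X1,X2) -> (X3,X4) -> X6 is Markov iff, with A=(x1,x2), B=(x3,x4), C=x6,
# p(A,B,C) * p(B) == p(A,B) * p(B,C) for every value.
pAB = sub.sum(axis=4)                        # p(x1, x2, x3, x4)
pBC = sub.sum(axis=(0, 1))                   # p(x3, x4, x6)
pB = pBC.sum(axis=2)                         # p(x3, x4)
lhs = sub * pB[None, None, :, :, None]
rhs = pAB[..., None] * pBC[None, None, ...]
print(np.allclose(lhs, rhs))                 # True: the grouped subchain is Markov
```

Each \(X_{\alpha_j}\) is treated as a single vector-valued variable; the check is the same factorization identity as before, applied block by block.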
Source: Information Theory.
Comments:
rot13,
Thanks for making your proof available.
The source says INFORMATION THEORY.
Is that the name of a book, or?