Saturday, January 11, 2014
Markov subchains
A subchain of a Markov chain is also a Markov chain. How can we prove this? Here is the idea: suppose we drop the variables strictly between $X_i$ and $X_n$ from the chain
$$\cdots \to X_i \to X_j \to \cdots \to X_n \to \cdots$$

$$p(x_1,\ldots,x_i,x_j,\ldots,x_n,\ldots) = p(x_1,x_2)\,p(x_3\mid x_2)\cdots p(x_j\mid x_i)\cdots p(x_n\mid x_{n-1})\cdots = p(x_1,x_2)\,\frac{p(x_2,x_3)}{p(x_2)}\cdots \frac{p(x_i,x_j)}{p(x_i)}\cdots \frac{p(x_{n-1},x_n)}{p(x_{n-1})}\cdots$$

Summing over the variables between $x_i$ and $x_n$ gives

$$p(x_1,\ldots,x_i,x_n,\ldots) = p(x_1,x_2)\,\frac{p(x_2,x_3)}{p(x_2)}\cdots \frac{p(x_i,x_n)}{p(x_i)}\cdots$$

which is exactly the joint distribution of the chain

$$\cdots \to X_i \to X_n \to \cdots$$
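The marginalization step above can be checked numerically. Here is a minimal sketch using NumPy; the chain length, alphabet size, and transition matrices are arbitrary choices for illustration, not from the source. It builds the joint of a chain $X_1 \to X_2 \to X_3 \to X_4$, marginalizes out $X_3$, and verifies that $X_1 \to X_2 \to X_4$ is again a Markov chain.

```python
import numpy as np

# Hypothetical example: a 4-step Markov chain X1 -> X2 -> X3 -> X4
# over a 3-symbol alphabet, with a random initial distribution and
# random row-stochastic transition matrices.
rng = np.random.default_rng(0)

def random_stochastic(n):
    """Return an n x n row-stochastic matrix."""
    m = rng.random((n, n))
    return m / m.sum(axis=1, keepdims=True)

n = 3
p1 = rng.random(n); p1 /= p1.sum()                 # distribution of X1
P12, P23, P34 = (random_stochastic(n) for _ in range(3))

# Joint p(x1, x2, x3, x4) from the Markov factorization.
joint = np.einsum('a,ab,bc,cd->abcd', p1, P12, P23, P34)

# Marginalize out X3: joint of the subchain (X1, X2, X4).
sub = joint.sum(axis=2)

# Markov property of the subchain X1 -> X2 -> X4:
# p(x4 | x1, x2) must equal p(x4 | x2) for all x1, x2, x4.
p_x4_given_x1x2 = sub / sub.sum(axis=2, keepdims=True)
p_x2x4 = sub.sum(axis=0)
p_x4_given_x2 = p_x2x4 / p_x2x4.sum(axis=1, keepdims=True)
assert np.allclose(p_x4_given_x1x2, p_x4_given_x2[None, :, :])
print("X1 -> X2 -> X4 is a Markov chain")
```

Since the transition matrices are random, the check passing for any seed is evidence that the property holds structurally, not by coincidence.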
Formally, let $N_n=\{1,2,\ldots,n\}$ and let $X_1 \to X_2 \to \cdots \to X_n$ form a Markov chain. For any subset $\alpha$ of $N_n$, denote $(X_i, i \in \alpha)$ by $X_\alpha$. Then for any disjoint subsets $\alpha_1, \alpha_2, \ldots, \alpha_m$ of $N_n$ such that

$$k_1 < k_2 < \cdots < k_m$$

for all $k_j \in \alpha_j$, $j = 1, 2, \ldots, m$,

$$X_{\alpha_1} \to X_{\alpha_2} \to \cdots \to X_{\alpha_m}$$

forms a Markov chain.
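The grouped statement can be checked the same way. In this sketch (again NumPy, with an arbitrary small chain not taken from the source), the chain is $X_1 \to \cdots \to X_5$ and the disjoint index sets are $\alpha_1 = \{1,2\}$, $\alpha_2 = \{3\}$, $\alpha_3 = \{5\}$, so every index in $\alpha_1$ precedes every index in $\alpha_2$, and so on; the claim is that $(X_1, X_2) \to X_3 \to X_5$ forms a Markov chain.

```python
import numpy as np

# Hypothetical 5-step binary Markov chain X1 -> X2 -> X3 -> X4 -> X5.
rng = np.random.default_rng(1)

def random_stochastic(n):
    """Return an n x n row-stochastic matrix."""
    m = rng.random((n, n))
    return m / m.sum(axis=1, keepdims=True)

n = 2
p1 = rng.random(n); p1 /= p1.sum()
P = [random_stochastic(n) for _ in range(4)]   # X1->X2, ..., X4->X5

# Joint p(x1, ..., x5) from the Markov factorization.
joint = np.einsum('a,ab,bc,cd,de->abcde', p1, *P)

# Marginalize out X4 (index 4 is in no alpha_j): p(x1, x2, x3, x5).
sub = joint.sum(axis=3)

# Markov property (X1, X2) -> X3 -> X5:
# p(x5 | x1, x2, x3) must depend on x3 only.
cond = sub / sub.sum(axis=3, keepdims=True)        # p(x5 | x1, x2, x3)
p_x3x5 = sub.sum(axis=(0, 1))
p_x5_given_x3 = p_x3x5 / p_x3x5.sum(axis=1, keepdims=True)
assert np.allclose(cond, p_x5_given_x3[None, None, :, :])
print("(X1, X2) -> X3 -> X5 is a Markov chain")
```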
Source: Information Theory.
Comments:
rot13,
Thanks for making your proof available.
The source says INFORMATION THEORY.
Is that the name of a book, or something else?