
Advanced Statistics / Advanced Probability and Statistics Math Forum
April 30th, 2007, 04:57 PM  #1
Member | Joined: Mar 2007 | Posts: 57 | Thanks: 0

Markov chains
Is the following true or false, and why?

E_x[ 1_y(X_n) ] = P^n(x, y)

In words: for a Markov chain started in state x at time 0, the expected value of the indicator function of state y (i.e. 1_y(Z) = 1 if Z = y and 0 otherwise) applied to X_n, the chain's state at time n, equals the probability that the chain goes from state x to state y in n steps.
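Since 1_y(X_n) only takes the values 0 and 1, its expectation is just P_x(X_n = y), which for a time-homogeneous chain is the (x, y) entry of the n-th power of the transition matrix, so the identity can also be checked numerically. A minimal Python sketch (the 3-state transition matrix, the states x and y, and the step count n below are made-up assumptions for illustration):

```python
import numpy as np

# Hypothetical 3-state transition matrix (rows sum to 1).
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])

rng = np.random.default_rng(0)
x, y, n = 0, 2, 4        # start state, target state, number of steps
trials = 200_000

# Monte Carlo estimate of E_x[1_y(X_n)]: run many chains n steps from x
# and average the indicator "did the chain land in y".
states = np.full(trials, x)
for _ in range(n):
    u = rng.random(trials)
    cum = np.cumsum(P[states], axis=1)          # per-chain cumulative rows
    states = (u[:, None] > cum).sum(axis=1)     # sample next state per chain
empirical = np.mean(states == y)

# Exact n-step transition probability P^n(x, y).
exact = np.linalg.matrix_power(P, n)[x, y]

print(empirical, exact)  # the two agree up to Monte Carlo error
```

With 200,000 trials the standard error is about 0.001, so the empirical average and the matrix-power entry should match to roughly two decimal places.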

