
Advanced Probability and Statistics Math Forum
July 22nd, 2009, 05:42 PM  #1
Markov Chains (Transition Matrix)
I am lost with this question and would appreciate any help!
July 24th, 2009, 09:00 PM  #2
Re: Markov Chains (Transition Matrix)
Let Xn be the Markov chain.

a) In state 1 there are no red balls in urn 1, and we want the i for which Pr{X1 = i | X0 = 1} = 1: it is a certain event and we want to determine i. Each minute one ball from each urn is moved to the other urn, and urn 1 contains no red balls, so on the next move one red ball is certain to move into urn 1 and one blue ball is certain to move out of it. That means we end up in state 2 with probability 1.

b) From state 2 we want Pr{X1 = 1 | X0 = 2}. The balls are chosen at random in each urn, so each ball is chosen with probability 1/3. The probability of returning to state 1 is Pr{red ball chosen in urn 1 | X0 = 2} * Pr{blue ball chosen in urn 2 | X0 = 2}; the factors multiply because the two choices are independent. That gives (1/3)(1/3) = 1/9, so Pr{X1 = 1 | X0 = 2} = 1/9.

To stay in state 2, i.e. Pr{X1 = 2 | X0 = 2}, there are two cases: the balls chosen from the two urns are either both red or both blue (so the swap makes no change to the number of each color in either urn). That is Pr{red chosen in urn 1 | X0 = 2} * Pr{red chosen in urn 2 | X0 = 2} + Pr{blue chosen in urn 1 | X0 = 2} * Pr{blue chosen in urn 2 | X0 = 2}. (I won't go into much detail about when to multiply and when to add; I will assume you know that well, and you should.) Given the composition of the urns in state 2, this is (1/3)(2/3) + (2/3)(1/3) = 4/9 = Pr{X1 = 2 | X0 = 2}. You can work out Pr{X1 = 3 | X0 = 2} on your own if you understand what is going on above.

c) Use the same reasoning to find all the transition probabilities.

d), e) Just apply all the information above.

f) This asks for the limiting distribution of each state. Apply the formula: let pi(i) be the limiting probability of state i, so that

pi(i) = sum over all j of pi(j) * P(j, i),

where P(j, i) is the transition probability from state j to state i, together with the constraint sum over all i of pi(i) = 1.
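The arithmetic in (b), (c), and (f) can be checked mechanically. Below is a minimal sketch, assuming the setup implied by the 1/3 probabilities above (each urn holds three balls, three red and three blue in total, and state i means i-1 red balls in urn 1); the names `transition`, `P`, and `pi` are mine, not from the problem.

```python
from fractions import Fraction

# Assumption: each urn holds N = 3 balls, 3 red and 3 blue in total, and one
# ball chosen uniformly from each urn is swapped per minute. Here r = state - 1
# = the number of red balls in urn 1, so r runs over 0..3.
N = 3

def transition(r, s):
    """P(r -> s) when one ball is drawn uniformly from each urn and swapped."""
    red1, blue1 = Fraction(r, N), Fraction(N - r, N)   # urn 1: r red, N-r blue
    red2, blue2 = Fraction(N - r, N), Fraction(r, N)   # urn 2: N-r red, r blue
    if s == r - 1:
        return red1 * blue2                  # red leaves urn 1, blue comes in
    if s == r + 1:
        return blue1 * red2                  # blue leaves urn 1, red comes in
    if s == r:
        return red1 * red2 + blue1 * blue2   # swap keeps the counts unchanged
    return Fraction(0)

P = [[transition(r, s) for s in range(N + 1)] for r in range(N + 1)]

# The chain only ever moves between neighbouring states (a birth-death chain),
# so the limiting-distribution equations in (f) reduce to detailed balance:
# pi(r) * P(r, r+1) = pi(r+1) * P(r+1, r), then normalise to sum to 1.
pi = [Fraction(1)]
for r in range(N):
    pi.append(pi[-1] * P[r][r + 1] / P[r + 1][r])
total = sum(pi)
pi = [p / total for p in pi]

print(P[1][0], P[1][1])        # 1/9 4/9, matching part (b) above
print([str(p) for p in pi])    # ['1/20', '9/20', '9/20', '1/20']
```

The detailed-balance shortcut is only valid because just neighbouring states communicate; for a general chain you would solve pi = pi * P with the normalisation constraint directly, exactly as stated in (f).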
