My Math Forum  

July 22nd, 2009, 05:42 PM   #1
Newbie
 
Joined: Jul 2009

Posts: 1
Thanks: 0

Markov Chains (Transition Matrix)

I am lost with this question and would appreciate any help!

Quote:
There are six balls; three are red and three are blue. There are two urns, which we will call urn 1 and urn 2. Three balls (of no particular colour) are in urn 1, and the other three are in urn 2. Every minute, one ball is removed at random from each urn and is placed in the other urn. If we know, say, the number of red balls in urn 1, then from this we can easily work out everything else. For example, if we have two red balls in urn 1, then there must be one blue ball in urn 1, one red ball in urn 2, and two blue balls in urn 2. Hence, we can define the states as representing the number of red balls in urn 1, i.e. 0, 1, 2, or 3, where the state number is one higher than the number of red balls (e.g. state 1 means no red balls in urn 1, etc.).
(a) Suppose that we are in state 1. What state must we move to on the next transition?
(b) Suppose that we are in state 2. (i) What is the probability of moving to state 1? (ii) What is the
probability of staying in state 2? (Hint: there are two ways for this to happen.) (iii) What is the
probability of moving to state 3?
(c) Analyze what happens if the system is currently in (i) state 3 (ii) state 4.
(d) Using the information from parts (a), (b), and (c), draw the state transition diagram.
(e) Based on (d), write the state transition matrix.
(f) By hand or by using Excel, calculate the long-run (i.e. steady-state) probabilities of being in each
state.
dani87 is offline  
 
July 24th, 2009, 09:00 PM   #2
Newbie
 
Joined: Jul 2009

Posts: 23
Thanks: 0

Re: Markov Chains (Transition Matrix)

Let Xn denote the Markov chain.
a) In state 1 there are no red balls in urn 1, and we want the i for which Pr{X1 = i | X0 = 1} = 1: the next transition is a certain event, and we just have to determine where it goes.
Every minute one ball from each urn is moved to the other urn, and since there are no red balls in urn 1, on the next move one red ball must be moved into urn 1 and one blue ball must be moved out of it.
So we move to state 2 with probability 1.
b) If we are in state 2, we want Pr{X1 = 1 | X0 = 2}. The balls are chosen at random in each urn, so each ball in an urn is chosen with probability 1/3.
The probability that we return to state 1 is Pr{a red ball is chosen in urn 1 | X0 = 2} * Pr{a blue ball is chosen in urn 2 | X0 = 2}. They are multiplied because the two choices are independent, and that gives (1/3)(1/3) = 1/9, so Pr{X1 = 1 | X0 = 2} = 1/9.
To stay in state 2 we want Pr{X1 = 2 | X0 = 2}. There are two cases: the balls chosen in the two urns are either both red or both blue (i.e. the swap makes no change to the number of coloured balls in each urn). That is Pr{a red ball is chosen in urn 1 | X0 = 2} * Pr{a red ball is chosen in urn 2 | X0 = 2} + Pr{a blue ball is chosen in urn 1 | X0 = 2} * Pr{a blue ball is chosen in urn 2 | X0 = 2}. I won't go into detail about when to multiply and when to add; I will assume you know that well (and you should). Since X0 = 2 means urn 1 holds 1 red and 2 blue while urn 2 holds 2 red and 1 blue, this works out to (1/3)(2/3) + (2/3)(1/3) = 4/9 = Pr{X1 = 2 | X0 = 2}.
You can work out Pr{X1 = 3 | X0 = 2} on your own if you understand what is going on above.
c) Use the same reasoning to find all the remaining transition probabilities.
d), e) Just put all of that information together into the diagram and the matrix.
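If you want to check your arithmetic for (b)–(e), here is a rough Python sketch of how the whole 4x4 matrix could be built from the swap rule above. The function name and variable names are just mine, not part of the problem; it's only one way to organise the computation.

[code]
from fractions import Fraction

# States are labelled 1..4; state s means s-1 red balls in urn 1.
# With r red balls in urn 1, urn 1 holds r red / 3-r blue and urn 2 holds 3-r red / r blue.
# One ball is drawn uniformly from each urn and the two are swapped, so:
#   r -> r-1 : red drawn from urn 1 AND blue drawn from urn 2 -> (r/3) * (r/3)
#   r -> r   : both drawn balls red, or both blue             -> 2 * (r/3) * ((3-r)/3)
#   r -> r+1 : blue drawn from urn 1 AND red drawn from urn 2 -> ((3-r)/3) * ((3-r)/3)

def transition_matrix():
    P = [[Fraction(0) for _ in range(4)] for _ in range(4)]
    for r in range(4):                      # r = number of red balls in urn 1
        down = Fraction(r, 3) * Fraction(r, 3)
        stay = 2 * Fraction(r, 3) * Fraction(3 - r, 3)
        up = Fraction(3 - r, 3) * Fraction(3 - r, 3)
        if r > 0:
            P[r][r - 1] = down
        P[r][r] = stay
        if r < 3:
            P[r][r + 1] = up
    return P

P = transition_matrix()
for row in P:
    print([str(p) for p in row])
# The row for r = 1 (state 2) comes out as [1/9, 4/9, 4/9, 0], matching part (b),
# and every row sums to 1, which is a good sanity check.
[/code]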
f) This is asking you to find the limiting (steady-state) distribution. Just apply the formula:
let Pi(i) be the limiting probability of state i; then
Pi(i) = sum over all j of Pi(j) * Pji, where Pji is the transition probability from state j to state i,
subject to the constraint sum over all i of Pi(i) = 1.
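As a sketch of one way to do part (f) numerically (Excel's MMULT would do the same job): start from any distribution and keep multiplying it by the transition matrix until it stops changing. This assumes the matrix from the sketch above and that the chain settles down (it does here, since state 2 can stay put).

[code]
# Minimal sketch of part (f): repeated multiplication by the 4x4 matrix from above.
# The limit satisfies pi = pi * P and sums to 1.

P = [
    [0.0, 1.0, 0.0, 0.0],      # state 1: 0 red balls in urn 1
    [1/9, 4/9, 4/9, 0.0],      # state 2: 1 red ball in urn 1
    [0.0, 4/9, 4/9, 1/9],      # state 3: 2 red balls in urn 1
    [0.0, 0.0, 1.0, 0.0],      # state 4: 3 red balls in urn 1
]

pi = [0.25, 0.25, 0.25, 0.25]   # any starting distribution works
for _ in range(200):
    pi = [sum(pi[j] * P[j][i] for j in range(4)) for i in range(4)]

print([round(p, 4) for p in pi])
# Prints roughly [0.05, 0.45, 0.45, 0.05], i.e. 1/20, 9/20, 9/20, 1/20.
[/code]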
llsyes is offline  