
Advanced Probability and Statistics Math Forum
November 14th, 2009, 07:27 AM  #1 
Newbie Joined: Nov 2009 Posts: 1 Thanks: 0  Help about Markov chains?
Each time unit, a data multiplexer receives a packet with probability a, and/or transmits a packet from its buffer with probability b. Assume that the multiplexer can hold at most N packets. This means that, if there are already N packets in the queue, then no new packet can arrive. Let Xn be the number of packets in the multiplexer at time n. (a) Show that the system can be modeled by a Markov chain. (b) Find the transition probability matrix P. (c) Find the stationary probability density function. Can you help me with part (c)? I don't understand the explanation in my textbook. Also, do you think this is a 2-state or a 3-state Markov chain?
November 17th, 2009, 09:22 AM  #2 
Member Joined: Oct 2009 Posts: 64 Thanks: 0  Re: Help about Markov chains?
It's an (N+1)-state chain, since the multiplexer can hold anywhere from 0 to N packets, no? For a probability distribution to be stationary, it must be unchanged by the transition matrix P. So if u is the stationary probability distribution, we must have uP = u (treating u as a row vector and P as row-stochastic; equivalently Pᵀu = u for a column vector). Use that to determine the relative probabilities for each state.
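To make this concrete, here is a small numerical sketch. It assumes a specific model (not spelled out in the original problem): arrivals and transmissions are independent each slot, a transmission requires a packet already in the buffer, and an arrival is blocked only when the buffer is full. The stationary row vector u is found by solving uP = u together with the normalization sum(u) = 1.

```python
import numpy as np

def transition_matrix(N, a, b):
    """Row-stochastic transition matrix on states 0..N (packets buffered).

    Assumed model: arrival (prob a) and transmission (prob b) happen
    independently each slot; transmitting needs a buffered packet,
    and arrivals are blocked only when the buffer is full.
    """
    P = np.zeros((N + 1, N + 1))
    for i in range(N + 1):
        if i == 0:
            P[0, 1] = a              # arrival; nothing to transmit
            P[0, 0] = 1 - a
        elif i == N:
            P[N, N - 1] = b          # transmission; arrivals blocked
            P[N, N] = 1 - b
        else:
            P[i, i + 1] = a * (1 - b)            # arrival only
            P[i, i - 1] = b * (1 - a)            # transmission only
            P[i, i] = a * b + (1 - a) * (1 - b)  # both or neither
    return P

def stationary(P):
    """Solve uP = u with sum(u) = 1 as an overdetermined linear system."""
    n = P.shape[0]
    A = np.vstack([P.T - np.eye(n), np.ones(n)])  # (P^T - I)u = 0, 1.u = 1
    rhs = np.zeros(n + 1)
    rhs[-1] = 1.0
    u, *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return u

P = transition_matrix(N=4, a=0.3, b=0.5)
u = stationary(P)
print(u)  # stationary distribution over 0..4 buffered packets
```

Under these assumptions the chain is a birth-death chain, so you can also solve the balance equations by hand: the ratio u[i+1]/u[i] is constant in the interior, which gives a geometric-like stationary distribution.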
