
Advanced Probability and Statistics Math Forum
September 26th, 2013, 03:24 AM  #1 
Newbie Joined: Apr 2013 Posts: 7 Thanks: 0  Expected value and Markov chains
I have a transition matrix and an initial state matrix:

[attachment: Tmatrix.jpg]

The top row of the state matrix is the probability of seeing film A, and the bottom row the probability of seeing film B. The question I have been given is:

"Given that a couple initially see film B, what is the expected number of times that they will see film B together over the next 4 trips to the theater?"

Raising the transition matrix to a power n and multiplying it by the initial state gives me the probability of them seeing either film on the nth trip to the theater. I'm wondering how to get the expected value out of that. Is it simply summing up the probabilities and multiplying by 4?

Thanks in advance,

deSitter
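A quick sketch of the calculation being described. The actual transition matrix is in the attachment (Tmatrix.jpg), so the numbers below are a made-up stand-in; the structure is what matters. By linearity of expectation, the expected number of B-visits is the sum over trips of the probability that that trip's film is B, not a single probability multiplied by 4:

```python
import numpy as np

# Hypothetical transition matrix (the real one is in the attachment, Tmatrix.jpg).
# States are ordered [A, B]; P[i, j] = P(next film is j | current film is i).
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

# Initial state: the couple saw film B before trip 1.
state = np.array([0.0, 1.0])

# Expected number of times film B is seen over the next 4 trips:
# sum of P(film B on trip n) for n = 1..4 (linearity of expectation).
expected_B = 0.0
for n in range(4):
    state = state @ P          # distribution over films on the next trip
    expected_B += state[1]     # probability that this trip's film is B

print(expected_B)
```

With the made-up matrix above the four trip probabilities are 0.6, 0.48, 0.444, 0.4332, so the expected count is their sum; substituting the real matrix from the attachment gives the answer to the posted question.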
September 28th, 2013, 01:08 AM  #2 
Newbie Joined: Apr 2013 Posts: 7 Thanks: 0  Re: Expected value and Markov chains
Anybody have any thoughts? I would've thought the expected value would be fairly simple to find, since the expected value of a binomial distribution is just n * p.
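The binomial intuition is close, but here the probability of seeing film B changes from trip to trip, so the n * p formula does not apply directly. The general version, by linearity of expectation, is

```latex
E[\text{visits to } B] \;=\; \sum_{n=1}^{4} \Pr(X_n = B \mid X_0 = B),
```

where each term comes from the initial state vector multiplied by the nth power of the transition matrix. The binomial n * p is the special case in which every term equals the same p.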

