My Math Forum — Expected value and Markov chains

Forum: Advanced Probability and Statistics

September 26th, 2013, 03:24 AM   #1
Newbie

Joined: Apr 2013

Posts: 7
Thanks: 0

Expected value and Markov chains

I have a transition and initial state matrix -

[Attachment: Tmatrix.jpg]

The top row of the state matrix is the probability of seeing film A, and the bottom row is the probability of seeing film B.

The question I have been given is: "Given that a couple initially see film B, what is the expected number of times that they will see film B together over the next 4 trips to the theater?" Raising the transition matrix to a power n and multiplying it by the initial state vector gives me the probability of them seeing either film on the nth trip to the theater. I'm wondering how to get the expected value out of that. Is it simply summing up the probabilities, or multiplying one of them by 4?

Thanks in advance,
- deSitter

September 28th, 2013, 01:08 AM   #2
Newbie

Joined: Apr 2013

Posts: 7
Thanks: 0

Re: Expected value and Markov chains

Anybody have any thoughts? I would have thought the expected value would be fairly simple to find, since the expected value of a binomial distribution is just n · p.
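For concreteness, here is a minimal sketch of the computation being discussed. Since the attached image is unavailable, the transition matrix values below are hypothetical placeholders; the convention used is row state vectors with entry P[i, j] = P(next film = j | current film = i), which is the transpose of the column convention described in the first post. By linearity of expectation, the expected number of B-viewings over the next 4 trips is the sum of P(see B on trip k) for k = 1..4, rather than a single probability multiplied by 4, because the distribution changes from trip to trip:

```python
import numpy as np

# Hypothetical 2x2 transition matrix (the original attachment is unavailable).
# States ordered (A, B); P[i, j] = P(next film = j | current film = i).
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

# The couple initially sees film B: state vector (P(A), P(B)) = (0, 1).
state = np.array([0.0, 1.0])

# Expected B-viewings over 4 trips = sum over trips of P(B on that trip).
expected_B = 0.0
for _ in range(4):
    state = state @ P          # advance the distribution by one trip
    expected_B += state[1]     # accumulate P(seeing B on this trip)

print(expected_B)  # 1.9572 for the placeholder matrix above
```

With the real matrix from the attachment substituted in, the same four-term sum gives the answer; the point is that each trip contributes its own probability of B, not four copies of the same one.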



Copyright © 2019 My Math Forum. All rights reserved.