My Math Forum  

Go Back   My Math Forum > High School Math Forum > Probability and Statistics

Probability and Statistics: Basic Probability and Statistics Math Forum

September 4th, 2019, 01:42 PM   #1
Joined: Sep 2019
From: New York

Posts: 6
Thanks: 0

Three friends are playing paintball. Maria is a good shot, hitting her target half of the time; Fan isn’t bad, hitting her target one third of the time; Peter is new to the game, and only hits his target one sixth of the time. Each round, all three players take a single shot at the same time, and anyone who is hit leaves the game. When all three people are still in the game, all players target the best shot out of their opponents; that is, Fan and Peter both target Maria, while Maria targets Fan. Write out a Markov chain model of the game, including a transition diagram and the matrix of transition probabilities. What is the probability that Peter is still in the game after three rounds of play?

This was given to me in a linear algebra class, and it's the only homework question I didn't quite know how to approach. I managed to draw a diagram, but I don't really know how to put this into a matrix. I know the first row (the transitions out of the starting state) would be something like

t1 = 1/2 F, 0 P, 1/2 M

and I assume the following rows would be t+1 and t+2, but how do you compute the probabilities? Thanks!
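One way to organize the computation: take the *states* of the chain to be the sets of players still in the game (with one-or-zero-player states absorbing), work out each round's transition probabilities from the independent, simultaneous shots, and then propagate the starting distribution through three rounds. Here is a minimal sketch in Python with exact fractions; the state labels and the by-hand transition probabilities are my own working from the targeting rule in the problem, so they are worth double-checking:

```python
from fractions import Fraction as F

# Hit probabilities from the problem statement.
half, third, sixth = F(1, 2), F(1, 3), F(1, 6)

# States: who is still in the game. One player left (or nobody) is absorbing.
states = ["MFP", "MP", "FP", "M", "F", "P", "none"]
T = {}

# From MFP: Maria shoots Fan (hits w.p. 1/2); Fan and Peter both shoot Maria,
# so Maria survives w.p. (2/3)*(5/6) = 5/9. Nobody shoots Peter this round.
m_survives = (1 - third) * (1 - sixth)
T["MFP"] = {"MFP": m_survives * (1 - half),        # nobody hit
            "MP":  m_survives * half,              # Fan eliminated
            "FP":  (1 - m_survives) * (1 - half),  # Maria eliminated
            "P":   (1 - m_survives) * half}        # both Maria and Fan out

# From MP: Maria shoots Peter (1/2), Peter shoots Maria (1/6).
T["MP"] = {"MP": (1 - half) * (1 - sixth), "M": half * (1 - sixth),
           "P": sixth * (1 - half), "none": half * sixth}

# From FP: Fan shoots Peter (1/3), Peter shoots Fan (1/6).
T["FP"] = {"FP": (1 - third) * (1 - sixth), "F": third * (1 - sixth),
           "P": sixth * (1 - third), "none": third * sixth}

# Absorbing states.
for s in ["M", "F", "P", "none"]:
    T[s] = {s: F(1)}

# Start with all three players in, and apply the chain three times.
dist = {"MFP": F(1)}
for _ in range(3):
    new = {}
    for s, p in dist.items():
        for t, q in T[s].items():
            new[t] = new.get(t, F(0)) + p * q
    dist = new

# Peter is still in the game in any state whose label contains "P".
peter_in = sum(p for s, p in dist.items() if "P" in s)
print(peter_in)  # 815/1296, about 0.629
```

The same dictionary-of-dictionaries is exactly the transition matrix: row `s`, column `t` holds P(next state = t | current state = s), and the three-round probability is the start vector times the matrix cubed. Note one feature the arithmetic exposes: from the full three-player state nobody targets Peter, so he can only be eliminated once the game reaches a two-player state.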

Last edited by skipjack; September 5th, 2019 at 01:53 PM.
Bioobird is offline  




Copyright © 2019 My Math Forum. All rights reserved.