April 3rd, 2012, 09:35 AM   #1
Newbie
 
Joined: Apr 2012

Posts: 4
Thanks: 0

help in markov chain

Hi, I have an exercise that I'm having trouble with. If anyone can help me I would appreciate it. Please help if you can.

This is the exercise:

Suppose the weather in a city on day n is described by a Markov chain and depends on day (n-1) as follows:

1. If it rains on day (n-1), then on day n it rains with probability 60% or is sunny with probability 40%.

2. If it is sunny on day (n-1), then on day n it is sunny with probability 60% or cloudy with probability 40%.

3. If it is cloudy on day (n-1), then on day n it rains with probability 20%, is sunny with probability 20%, is cloudy with probability 55%, or snows with probability 5%.

4. If it snows on day (n-1), then on day n it is cloudy with probability 90% or snows with probability 10%.


The exercise asks for the following:

1. Markov chain transition matrix
2. Solve the balance equations and calculate the steady-state probabilities.

Thanks in advance
legendoulis is offline  
 
April 3rd, 2012, 09:55 AM   #2
Newbie
 
Joined: Apr 2012

Posts: 4
Thanks: 0

Re: help in markov chain

I can't work out either part: neither setting up the Markov chain transition matrix nor solving the balance equations to calculate the steady-state probabilities.
legendoulis is offline  
April 3rd, 2012, 02:21 PM   #3
Math Team
 
Joined: Dec 2006
From: Lexington, MA

Posts: 3,267
Thanks: 408

Re: help in markov chain

Hello, legendoulis!

Quote:
Suppose the weather in a city on day n is described by a Markov chain
and it depends on day (n - 1) as follows:

1. If it rains on day (n - 1), then on day n
it rains with probability 60% or is fair with probability 40%.

2. If it is fair on day (n - 1), then on day n
it is fair with probability 60% or cloudy with probability 40%.

3. If it is cloudy on day (n - 1), then on day n
it rains with probability 20%, is fair with probability 20%,
is cloudy with probability 55%, or snows with probability 5%.

4. If it snows on day (n - 1), then on day n
it is cloudy with probability 90% or snows with probability 10%.

The exercise asks for the following:
(1) the Markov chain transition matrix;
(2) solve the balance equations and calculate the steady-state probabilities.



With the states ordered Rain (R), Fair (F), Cloudy (C), Snow (S), the transition matrix is

         R     F     C     S
  R   [ 0.60  0.40  0.00  0.00 ]
  F   [ 0.00  0.60  0.40  0.00 ]
  C   [ 0.20  0.20  0.55  0.05 ]
  S   [ 0.00  0.00  0.90  0.10 ]

Writing the steady-state probabilities as (r, f, c, s), the balance equations pi = pi P
together with r + f + c + s = 1 are:

  r = 0.60r + 0.20c
  f = 0.40r + 0.60f + 0.20c
  c = 0.40f + 0.55c + 0.90s
  s = 0.05c + 0.10s

The first equation gives r = c/2, the second then gives f = r + c/2 = c,
and the fourth gives s = c/18 (the third equation is redundant).
Substituting into r + f + c + s = 1:

  c/2 + c + c + c/18 = 46c/18 = 23c/9 = 1, so c = 9/23.

Therefore the steady-state probabilities are

  r = 9/46,  f = 9/23,  c = 9/23,  s = 1/46.
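As a quick numerical cross-check of those values, here is a minimal NumPy sketch (my own addition; the state ordering Rain, Fair, Cloudy, Snow and all variable names are assumptions, not part of the original post):

[code]
# Minimal check of the steady-state vector (sketch; NumPy and the state
# ordering Rain, Fair, Cloudy, Snow are my own choices).
import numpy as np

P = np.array([[0.60, 0.40, 0.00, 0.00],   # Rain   -> Rain, Fair, Cloudy, Snow
              [0.00, 0.60, 0.40, 0.00],   # Fair
              [0.20, 0.20, 0.55, 0.05],   # Cloudy
              [0.00, 0.00, 0.90, 0.10]])  # Snow

# Solve pi P = pi together with sum(pi) = 1: drop one (redundant) balance
# equation and replace it with the normalization condition.
A = np.vstack([(P.T - np.eye(4))[:-1], np.ones(4)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi = np.linalg.solve(A, b)

print(pi)                       # ~ [0.1957, 0.3913, 0.3913, 0.0217]
print(9/46, 9/23, 9/23, 1/46)   # the exact values derived above
[/code]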

soroban is offline  
April 3rd, 2012, 03:48 PM   #4
Newbie
 
Joined: Apr 2012

Posts: 4
Thanks: 0

Re: help in markov chain

Thank you very, very much, soroban, for the answer.

I have one more thing:

We have to simulate the Markov chain that we created. I will explain:

You can start the chain from whichever state you want.
1. At every step of the simulation we must generate a random number.
2. We then use the transition matrix to move to the next state,
and so on.
The simulation must run for 1080 steps of the above (1. and 2.); a sketch of how this could be done is given below.
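A minimal sketch of that procedure, assuming Python/NumPy (the language, the variable names, and the Rain/Fair/Cloudy/Snow state ordering are assumptions, not something the exercise specifies):

[code]
# Sketch of the 1080-step simulation described above.
import numpy as np

P = np.array([[0.60, 0.40, 0.00, 0.00],
              [0.00, 0.60, 0.40, 0.00],
              [0.20, 0.20, 0.55, 0.05],
              [0.00, 0.00, 0.90, 0.10]])
cum = np.cumsum(P, axis=1)       # cumulative probabilities of each row

rng = np.random.default_rng()
state = 0                        # start from whichever state you want (0 = Rain)
counts = np.zeros(4)

for _ in range(1080):            # 1080 steps, as the exercise requires
    u = rng.random()             # step 1: draw a uniform random number in [0, 1)
    # step 2: use the current state's transition row to pick the next state
    state = min(int(np.searchsorted(cum[state], u, side="right")), 3)
    counts[state] += 1

print(counts / 1080)             # should come out close to the steady-state vector
[/code]

The min(..., 3) only guards against floating-point round-off in the last cumulative entry; otherwise the loop is exactly the two steps described above.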
legendoulis is offline  
April 4th, 2012, 11:28 AM   #5
Newbie
 
Joined: Apr 2012

Posts: 4
Thanks: 0

Re: help in markov chain

We also need to draw the Markov chain diagram (state transition diagram) from the data that we have; one possible way to generate it is sketched below.
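One possible way to produce such a diagram is to sketch it with networkx and matplotlib (purely an assumption on my part; any graphing tool, or a drawing by hand, works just as well):

[code]
# Sketch: draw the weather chain's transition diagram with networkx/matplotlib.
import networkx as nx
import matplotlib.pyplot as plt

states = ["Rain", "Fair", "Cloudy", "Snow"]
P = [[0.60, 0.40, 0.00, 0.00],
     [0.00, 0.60, 0.40, 0.00],
     [0.20, 0.20, 0.55, 0.05],
     [0.00, 0.00, 0.90, 0.10]]

G = nx.DiGraph()
for i, src in enumerate(states):
    for j, dst in enumerate(states):
        if P[i][j] > 0:                    # draw only transitions that can occur
            G.add_edge(src, dst, weight=P[i][j])

pos = nx.circular_layout(G)
nx.draw(G, pos, with_labels=True, node_size=2200, node_color="lightblue")
edge_labels = nx.get_edge_attributes(G, "weight")
nx.draw_networkx_edge_labels(G, pos, edge_labels=edge_labels)
plt.show()
[/code]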
legendoulis is offline  