
Question

The state of a process changes daily according to a two-state Markov chain. If the process is in state i during one day, then it is in state j the following day with probability P_ij, where

P_00 = 0.4, P_01 = 0.6, P_10 = 0.2, P_11 = 0.8.

Every day a message is sent. If the state of the Markov chain that day is i, then the message sent is "good" with probability p_i and "bad" with probability q_i = 1 - p_i, i = 0, 1.

(a) If the process is in state 0 on Monday, what is the probability that a good message is sent on Tuesday?

(b) If the process is in state 0 on Monday, what is the probability that a good message is sent on Friday?

(c) In the long run, what proportion of messages are good?

(d) Let Y_n equal 1 if a good message is sent on day n and let it equal 2 otherwise. Is {Y_n, n >= 1} a Markov chain? If so, give its transition probability matrix. If not, briefly explain why not.

Explanation / Answer

(a) Starting in state 0 on Monday, condition on Tuesday's state:

P(good message on Tuesday) = P_00 p_0 + P_01 p_1 = 0.4 p_0 + 0.6 p_1.
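As a numeric sanity check (the problem leaves p_0 and p_1 symbolic, so the values below are hypothetical), the computation is:

```python
# Transition probabilities from the problem statement.
P00, P01 = 0.4, 0.6

# Hypothetical emission probabilities -- the problem keeps p_0, p_1 symbolic.
p0, p1 = 0.1, 0.9

# Starting in state 0 on Monday, condition on Tuesday's state.
prob_good_tuesday = P00 * p0 + P01 * p1
print(prob_good_tuesday)  # 0.4*0.1 + 0.6*0.9 = 0.58
```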

(b) Monday to Friday is four transitions, so we need the four-step matrix P^4. Squaring twice:

P^2 = [0.28 0.72; 0.24 0.76],  P^4 = [0.2512 0.7488; 0.2496 0.7504].

Starting from state 0,

P(good message on Friday) = (P^4)_00 p_0 + (P^4)_01 p_1 = 0.2512 p_0 + 0.7488 p_1.
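The four-step matrix can be verified with NumPy's matrix power; a minimal sketch, again with hypothetical values p_0 = 0.1, p_1 = 0.9:

```python
import numpy as np

# One-step transition matrix from the problem statement.
P = np.array([[0.4, 0.6],
              [0.2, 0.8]])

# Monday -> Friday is four transitions.
P4 = np.linalg.matrix_power(P, 4)
print(P4)  # approximately [[0.2512 0.7488]
           #                [0.2496 0.7504]]

# Hypothetical emission probabilities for a numeric answer.
p0, p1 = 0.1, 0.9
prob_good_friday = P4[0, 0] * p0 + P4[0, 1] * p1
print(prob_good_friday)  # approximately 0.69904
```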

(c) The stationary distribution (pi_0, pi_1) solves pi_0 = 0.4 pi_0 + 0.2 pi_1 with pi_0 + pi_1 = 1, giving

pi_0 = P_10 / (P_01 + P_10) = 0.2/0.8 = 1/4,  pi_1 = P_01 / (P_01 + P_10) = 0.6/0.8 = 3/4.

The long-run proportion of good messages is therefore

pi_0 p_0 + pi_1 p_1 = (1/4) p_0 + (3/4) p_1.
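The stationary distribution can be cross-checked numerically by solving pi P = pi together with the normalization pi_0 + pi_1 = 1 (a sketch, not part of the original solution):

```python
import numpy as np

P = np.array([[0.4, 0.6],
              [0.2, 0.8]])

# pi @ P = pi  rewritten as  (P.T - I) pi = 0, plus a normalization row
# forcing pi to sum to 1; solve the stacked system by least squares.
A = np.vstack([P.T - np.eye(2), np.ones(2)])
b = np.array([0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print(pi)  # approximately [0.25 0.75]
```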

(d) {Y_n} is not a Markov chain (unless p_0 = p_1). Tomorrow's message depends on the underlying state X_n, and the earlier messages Y_1, ..., Y_{n-1} carry extra information about X_n beyond Y_n alone. So P(Y_{n+1} | Y_n, ..., Y_1) is not determined by Y_n by itself, and no transition probability matrix for {Y_n} exists.
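One way to see this concretely is to compute P(Y_3 = good | Y_2 = good, Y_1 = good) and P(Y_3 = good | Y_2 = good, Y_1 = bad) exactly and observe that they differ. The sketch below starts the hidden chain from its stationary distribution and uses hypothetical values p_0 = 0.1, p_1 = 0.9:

```python
import numpy as np

P = np.array([[0.4, 0.6],
              [0.2, 0.8]])
pi = np.array([0.25, 0.75])   # stationary distribution of the hidden chain
p = np.array([0.1, 0.9])      # hypothetical p_0, p_1

# Diagonal "emission" matrices: Dg weights each state by P(good | state),
# Db by P(bad | state).
Dg, Db = np.diag(p), np.diag(1 - p)

def prob(seq):
    """P(Y_1 ... Y_n = seq), where seq is a string of 'g'/'b'."""
    v = pi.copy()
    for i, y in enumerate(seq):
        if i > 0:
            v = v @ P                     # advance the hidden state one day
        v = v @ (Dg if y == "g" else Db)  # weight by that day's message
    return v.sum()

cond_gg = prob("ggg") / prob("gg")  # P(Y3=g | Y2=g, Y1=g)
cond_bg = prob("bgg") / prob("bg")  # P(Y3=g | Y2=g, Y1=b)
print(cond_gg, cond_bg)             # the two conditionals differ
```

Since the two conditional probabilities disagree even though Y_2 is the same in both, Y_3 depends on more than Y_2 alone, so no 2x2 transition matrix can describe {Y_n}.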
