Markov Chain Analysis is a technique for finding the probability of the future state of a system, given its current state and the probabilities of transition between states. It is applicable wherever a system can be in any one of multiple possible states. As a simple example, the weather can be sunny, rainy or cloudy, and we know the transition probability between each pair of states. If it is sunny today, then the following day it can be sunny, rainy or cloudy with probabilities of 0.7, 0.1 and 0.2 respectively. Using Markov Chain Analysis, we can find the probability of the weather being sunny, rainy or cloudy on any future day.
This technique has several applications, such as Market Share analysis, Machine Failure, Weather Forecasting, Bond Rating movement and so on.
This analysis rests on the assumption that the states are collectively exhaustive and mutually exclusive. Hence, in the retail example that follows, a consumer can shop at one of only 3 retailers, and with only one retailer at any point of time.
The state of a system is represented by a State Vector. If we have three retailers – A, B and C – and a customer is currently shopping at A, then this state is shown as below.
Current State = (1, 0, 0)
In addition, a Transition Probability is associated with each pair of states. For example, a customer shopping at A can shop at A, B or C the next week with likelihoods of 27%, 46% and 27% respectively. We would have similar transition probabilities for B and C too. Hence for n states or conditions, we would have an n*n Transition Probability Matrix.
Please see the same case in the Example Sheet. It refers to the market share of three retailers, and analyzes the probability of a customer shopping at each of them in every subsequent week. For the Current State, where the customer shops at A, and a given Transition Probability Matrix, the state probabilities for three subsequent weeks are shown in the picture below.
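The week-by-week calculation just described can be sketched in a few lines of Python with NumPy. The matrix below uses the transition probabilities of this example (its first column is the 27%/46%/27% figures for retailer A given above, and columns for B and C are the ones used in the equilibrium equations later in this note):

```python
import numpy as np

# T[i, j] = Pr(shop at retailer i next week | shopping at retailer j this week)
# Each column therefore sums to 1.
T = np.array([
    [0.27, 0.72, 0.43],   # to A, from A / B / C
    [0.46, 0.14, 0.43],   # to B
    [0.27, 0.14, 0.14],   # to C
])

state = np.array([1.0, 0.0, 0.0])   # Current State: customer shops at A

# Multiply by T once per week to get the state probabilities for
# the next three weeks.
for week in range(1, 4):
    state = T @ state
    print(f"Week {week}: {np.round(state, 4)}")
```

After week 1 the vector is simply the A-column of the matrix, (0.27, 0.46, 0.27), and each later week is one more multiplication by T.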
Often the probabilities converge to a final state after a sufficiently large number of periods. This is called the “Equilibrium Condition”.
The Equilibrium Condition, by definition, means that the probabilities no longer change from one period to the next. Hence if we denote the probabilities of shopping at the retailers as Pa, Pb and Pc, then
[Pa, Pb, Pc] = T[Pa, Pb, Pc], where T is the Transition Probability Matrix. By simple matrix algebra, we get the following three equations.
Pa = 0.27Pa + 0.72Pb + 0.43Pc -------------- 1
Pb = 0.46Pa + 0.14Pb + 0.43Pc -------------- 2
Pc = 0.27Pa + 0.14Pb + 0.14Pc -------------- 3
In addition, we get one more equation, which arises from the collectively exhaustive nature of the states.
Pa + Pb + Pc = 1 ----------------- 4
In order to solve for them, we need to drop one equation (any except 4), so that the number of equations matches the number of unknowns (3 each in this case).
It is perfectly alright to drop any one of the first three equations, as they are linearly dependent: adding all three yields an identity, since each column of T sums to 1.
Dropping eq. 3 and solving the rest, we get the following values.
[Pa, Pb, Pc] = [0.457, 0.344, 0.199]; the sum of these is equal to 1.
Please follow the calculation in the Example Sheet in the tab “Equilibrium State”. One sanity check is to multiply the Transition Probability Matrix with these values; we should get the same values back.
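The same calculation can be sketched programmatically: we rearrange equations 1, 2 and 4 into the form A·p = b (equation 3 is the one dropped) and solve the linear system, assuming NumPy is available:

```python
import numpy as np

# Equations 1 and 2 rearranged to "... = 0", plus equation 4 (sum = 1).
A = np.array([
    [1 - 0.27, -0.72,     -0.43],   # eq. 1: Pa - 0.27Pa - 0.72Pb - 0.43Pc = 0
    [-0.46,     1 - 0.14, -0.43],   # eq. 2
    [1.0,       1.0,       1.0],    # eq. 4: Pa + Pb + Pc = 1
])
b = np.array([0.0, 0.0, 1.0])

p = np.linalg.solve(A, b)           # [Pa, Pb, Pc]
print(np.round(p, 3))

# Sanity check from the text: T @ p should reproduce p.
T = np.array([[0.27, 0.72, 0.43],
              [0.46, 0.14, 0.43],
              [0.27, 0.14, 0.14]])
print(np.allclose(T @ p, p))        # True at equilibrium
```

The solved vector rounds to [0.457, 0.344, 0.199], matching the values above, and multiplying by T leaves it unchanged.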
There is a very important concept in Markov Chain Analysis called the “Absorbing State”. An Absorbing State is a state that, once entered, is never left. For example, an account classified as “Bad Debt” would continue in that category in all future periods.
As an example, an Accounts Receivable system can have the following four states.
Overdue, due for more than 6 months
Overdue, due for more than 1 month
Bad Debt, due for more than 1 year
Paid, no obligations
Out of the above, Paid and Bad Debt are Absorbing States: once a customer is put into one of these states, he continues being in that state. The Transition Probability Matrix of such a scenario would look like the following. We see that the probability for an absorbing state is 1 for that particular state and zero otherwise. Such a case can arise in several scenarios.
Equilibrium State calculations for cases involving Absorbing States have a very powerful use. Only Absorbing States can be present in the Equilibrium State in such cases. Hence ultimately a customer would be in either the Bad Debt or the Paid category, with differing probabilities. If we can find these probabilities, then we know the fraction of customers falling into each of these states in the long run. This helps a lender manage his assets in a better way.
The Equilibrium Condition for these cases is more involved, and needs rigorous matrix manipulation. We will leave the discussion at this stage, with this brief introduction.
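Even without the full matrix algebra, the long-run behaviour can be seen numerically by applying the transition matrix repeatedly. The probabilities below are assumed purely for this sketch (they are not the figures from the Example Sheet); note how the absorbing columns for Paid and Bad Debt have a 1 on the diagonal and 0 elsewhere:

```python
import numpy as np

# State order: [Paid, Bad Debt, Overdue > 6 months, Overdue > 1 month]
# Column j holds the probabilities of moving FROM state j.
# All non-absorbing probabilities here are ASSUMED for illustration.
T = np.array([
    [1.0, 0.0, 0.40, 0.50],   # to Paid
    [0.0, 1.0, 0.30, 0.05],   # to Bad Debt
    [0.0, 0.0, 0.30, 0.25],   # to Overdue > 6 months
    [0.0, 0.0, 0.00, 0.20],   # to Overdue > 1 month
])

state = np.array([0.0, 0.0, 0.0, 1.0])   # a customer overdue > 1 month
for _ in range(50):                      # iterate until convergence
    state = T @ state

print(np.round(state, 3))  # all probability mass ends in Paid / Bad Debt
```

After enough periods the two overdue entries shrink to zero and the entire probability splits between the two absorbing states, which is exactly the long-run fraction of customers a lender would care about.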
We will briefly introduce another advanced concept in Markov Chain Analysis, called Hidden Markov Models. Here the states are not known (they are hidden); rather, a Symbol is observed for a given state. Hence rather than working in terms of STATES, we work on SYMBOLS, which depend on the STATES. In addition to the TRANSITION PROBABILITY, an EMISSION PROBABILITY is also required, i.e. the probability that symbol b is observed when state k is present. Given the SYMBOL sequence, the STATE sequence is hidden, and the idea is to find the most likely state sequence that results in the given symbol sequence.
As an example, we can have a Transition Matrix for Weather (3*3) – snow, storm, clear – and an Emission Matrix for symbols of flight performance (on time, delayed). The objective could be to find the most likely sequence of Weather states for an observed sequence of flight performances.
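The standard way to find that most likely state sequence is the Viterbi algorithm. Below is a minimal sketch for the weather/flight example; every probability (starting, transition and emission) is assumed purely for illustration:

```python
# Hidden states and a Viterbi sketch; all probabilities are ASSUMED.
states = ("snow", "storm", "clear")

start_p = {"snow": 0.2, "storm": 0.1, "clear": 0.7}
# trans_p[r][s] = Pr(next state s | current state r)
trans_p = {
    "snow":  {"snow": 0.4, "storm": 0.3, "clear": 0.3},
    "storm": {"snow": 0.3, "storm": 0.4, "clear": 0.3},
    "clear": {"snow": 0.1, "storm": 0.1, "clear": 0.8},
}
# emit_p[s][symbol] = Pr(observing symbol | state s)  -- the Emission Matrix
emit_p = {
    "snow":  {"on time": 0.3, "delayed": 0.7},
    "storm": {"on time": 0.1, "delayed": 0.9},
    "clear": {"on time": 0.9, "delayed": 0.1},
}

def viterbi(obs):
    # V[t][s] = probability of the best state path ending in state s at time t
    V = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
    back = [{}]
    for t in range(1, len(obs)):
        V.append({})
        back.append({})
        for s in states:
            # Best previous state to arrive at s
            prev, p = max(((r, V[t - 1][r] * trans_p[r][s]) for r in states),
                          key=lambda x: x[1])
            V[t][s] = p * emit_p[s][obs[t]]
            back[t][s] = prev
    # Trace back the most likely path from the best final state
    last = max(states, key=lambda s: V[-1][s])
    path = [last]
    for t in range(len(obs) - 1, 0, -1):
        path.append(back[t][path[-1]])
    return list(reversed(path))

print(viterbi(["on time", "on time", "delayed"]))
```

With these assumed numbers, two on-time flights followed by a delay are most plausibly explained by two clear days and then a storm, which is the kind of inference an HMM makes from symbols alone.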