**What is Markov Chain Analysis ?**

Markov Analysis is a way of analyzing the current movement of some variable in an effort to forecast its future movement. As a management tool, Markov analysis has been used during the last several years mainly as a marketing aid for examining and predicting the behavior of customers from the standpoint of their loyalty to one brand and their switching patterns to other brands. Though this is a leading use of the technique, there are other areas in which the application of Markov Analysis has produced significant results.

**For Example :**

- In the field of accounting, it can be applied to the behavior of accounts receivable.
- In production, it is helpful in evaluating alternative maintenance policies, queuing systems and work assignments. The technique is also useful to the Personnel department in determining future manpower requirements of an organisation.

**Markov Chains**

A Markov chain (or Markov process) is a stochastic process used to analyse decision problems in which the occurrence of a specific event depends on the event immediately prior to the current event. A Markov process is a time-based process where the probability of a state on the next trial (time period) depends only on the state in the current trial (time period) and not on how one arrived at the present state.

In essence, a Markov process is a sequence of experiments, each of which has n possible outcomes X_{1}, X_{2}, ..., X_{n}. Each individual outcome is called a state, and the probability that a particular outcome occurs depends only on the outcome of the preceding experiment.
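This memoryless behaviour can be sketched in a short simulation. The three-state chain and its probabilities below are purely illustrative assumptions, not taken from the text:

```python
import random

# Hypothetical 3-state chain (states X1, X2, X3), for illustration only.
# transition[i][j] = probability of moving from state i to state j in one step.
transition = [
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.2, 0.2, 0.6],
]

def next_state(current):
    """Sample the next state given ONLY the current state (Markov property)."""
    r = random.random()
    cumulative = 0.0
    for j, p in enumerate(transition[current]):
        cumulative += p
        if r < cumulative:
            return j
    return len(transition[current]) - 1  # guard against rounding

# Run a short sequence of experiments starting from state 0.
state = 0
path = [state]
for _ in range(10):
    state = next_state(state)
    path.append(state)
print(path)  # the exact sequence depends on the random draws
```

Note that `next_state` never looks at `path`: the history of how the process reached the current state plays no role in the next draw.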

**Properties of Markov Chains**

Salient properties of Markov Chains are as follows :

- The set of all possible states of a stochastic (probabilistic) system is finite.
- The variables move from one state to another, and the probability of transition from a given state depends only on the present state of the system, not on how that state was reached.
- States are both collectively exhaustive and mutually exclusive.
- The probabilities of reaching various states from any given state are measurable and remain constant over time (i.e., throughout the system's operation).
- The transition probabilities of moving to alternative states in the next period, given a state in the current period, must sum to 1.0.
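The last two numeric properties can be checked directly. A minimal sketch, with an illustrative two-state matrix of assumed numbers:

```python
def is_valid_transition_matrix(P, tol=1e-9):
    """Check that every entry is a probability in [0, 1]
    and that every row sums to 1.0."""
    for row in P:
        if any(p < 0 or p > 1 for p in row):
            return False
        if abs(sum(row) - 1.0) > tol:
            return False
    return True

# Hypothetical two-state switching matrix (illustrative numbers only).
P = [[0.9, 0.1],
     [0.3, 0.7]]
print(is_valid_transition_matrix(P))             # True
print(is_valid_transition_matrix([[0.5, 0.6],
                                  [0.3, 0.7]]))  # False: first row sums to 1.1
```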

**Assumptions of Markov Chains**

Assumptions of Markov Chains are as follows :

**1) Connectivity :**

All the alternatives in the set are connected. This implies that alternatives should be related to each other by the preference/indifference relations.

**2) Consistency :**

The preference relation between two alternatives cannot be reversed at some point in time. This implies that if the consumer prefers A to B, he cannot prefer B to A or be indifferent between A and B.

**3) Transitivity :**

If, among three alternatives, one prefers A to B and B to C, then one must prefer A to C.

**Classification of the States in a Markov Chain**

The states of a Markov chain can be classified based on the transition probabilities P_{ij} of the transition matrix P.

**1) Absorbing :**

A state j is absorbing if it returns to itself with certainty in one transition, that is, P_{jj} = 1.

**2) Transient :**

A state j is transient if the process can move from j to some other state from which it can never return to j.

Mathematically, this will happen if lim_{n→∞} P_{ij}^{(n)} = 0 for all i.

**3) Recurrent :**

A state j is recurrent if, starting from j, the process returns to j with probability 1. This can happen if, and only if, the state is not transient.

**4) Periodic :**

A state j is periodic with period t > 1 if a return is possible only in t, 2t, 3t, ... steps. This means that P_{jj}^{(n)} = 0 whenever n is not divisible by t.

**State-Transition Matrix**

A state-transition matrix is a square array which summarizes the transition probabilities for a given Markov process. In such a matrix the rows identify the current state of the system being studied and the columns identify the alternative states to which the system can move.

Let E_{i} = state i of a stochastic process (i = 1, 2, ..., m) and P_{ij} = transition probability of moving from state E_{i} to state E_{j} in one step.

**Then, a one-stage state-transition matrix P can be described as given below :**

                E_{1}    E_{2}   ...   E_{m}
        E_{1}   P_{11}   P_{12}  ...   P_{1m}
    P = E_{2}   P_{21}   P_{22}  ...   P_{2m}
        ...     ...      ...     ...   ...
        E_{m}   P_{m1}   P_{m2}  ...   P_{mm}

In the transition matrix of the Markov chain, P_{ij} = 0 means that no transition occurs from state i to state j; and P_{ij} = 1 means that when the system is in state i, it can move only to state j at the next transition.

**Each row of the transition matrix represents a one-step transition probability distribution over all states. This means :**

P_{i1} + P_{i2} + ... + P_{im} = 1 for all i, and 0 ≤ P_{ij} ≤ 1

**Transition Diagram**

A transition diagram shows the transitions, with their probabilities, that can occur from each state. Such a diagram is given in the figure below.

The arrows from each state indicate the possible states to which a process can move from the given state. The following transition matrix corresponds to the adjoining diagram :

A zero element in the matrix indicates that the transition is impossible.
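These ideas can be combined in a small sketch: an assumed 3-state matrix (not the one from the missing figure) whose zero entries mark impossible transitions and whose last state is absorbing, with n-step probabilities obtained as the matrix power P^n:

```python
# Pure-Python matrix multiply so the sketch needs no third-party libraries.
def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def n_step(P, n):
    """n-step transition probabilities: the matrix power P^n."""
    result = P
    for _ in range(n - 1):
        result = mat_mul(result, P)
    return result

# Illustrative 3-state matrix: state 2 is absorbing (P_22 = 1), and the
# zero entries indicate transitions that are impossible in one step.
P = [[0.6, 0.4, 0.0],
     [0.0, 0.5, 0.5],
     [0.0, 0.0, 1.0]]

# A state j is absorbing exactly when P_jj = 1.
absorbing = [j for j in range(len(P)) if P[j][j] == 1.0]
print(absorbing)  # [2]

P5 = n_step(P, 5)
print(P5[0][2])   # probability of having reached state 2 within 5 steps from state 0
```

Note that although a one-step move from state 0 to state 2 is impossible (P_{02} = 0), the five-step probability P_{02}^{(5)} is positive, because the process can pass through state 1 on the way.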

**Applications Related to Management Functional Areas**

The major applications of Markov chain are in the following areas :

**1) Personnel :**

Determining future manpower requirements of an organisation taking into consideration retirements, deaths, resignations, etc.

**2) Finance :**

Determining allowances for doubtful accounts.

**3) Production :**

Helpful in evaluating alternative maintenance policies, queuing systems and work assignments.

**4) Marketing :**

Useful in analyzing and predicting customers' buying behavior in terms of loyalty to a particular product brand, switching patterns to other brands, and the market share of the company versus its competitors.
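The brand-switching application can be sketched numerically. The two-brand matrix and starting shares below are assumed for illustration; iterating the shares forward one period at a time shows them settling at the steady state of the chain:

```python
# Hypothetical brand-switching matrix for two brands, A and B.
# Row i gives the probabilities that a customer of brand i buys each brand
# in the next period.
P = [[0.8, 0.2],   # 80% of A's customers stay loyal, 20% switch to B
     [0.4, 0.6]]   # 40% of B's customers switch to A, 60% stay loyal

shares = [0.5, 0.5]            # assumed current market shares of A and B
for _ in range(50):            # advance one period at a time
    shares = [sum(shares[i] * P[i][j] for i in range(2)) for j in range(2)]

print([round(s, 3) for s in shares])  # [0.667, 0.333]
```

For this matrix the long-run shares are 2/3 for A and 1/3 for B regardless of the starting split, which is exactly the kind of market-share forecast described above.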
