
Markov process

[Figure: Markov process example]

In probability theory and statistics, a Markov process or Markoff process, named after the Russian mathematician Andrey Markov, is a stochastic process satisfying a certain property, called the Markov property. A Markov process can be thought of as 'memoryless': loosely speaking, a process satisfies the Markov property if one can make predictions for the future of the process based solely on its present state just as well as one could knowing the process's full history. That is, conditional on the present state of the system, its future and past are independent.[1]

Markov processes arise in probability and statistics in one of two ways. A stochastic process, defined via a separate argument, may be shown mathematically to have the Markov property, and as a consequence to have the properties that can be deduced from this for all Markov processes. Alternatively, in modelling a process, one may assume the process to be Markov, and take this as the basis for a construction. In modelling terms, assuming that the Markov property holds is one of a limited number of simple ways of introducing statistical dependence into a model for a stochastic process in such a way that the strength of dependence declines as the lag between observations increases.

Often, the term Markov chain is used to mean a Markov process which has a discrete (finite or countable) state-space. Usually a Markov chain would be defined for a discrete set of times (i.e. a discrete-time Markov chain)[2] although some authors use the same terminology where "time" can take continuous values.[3] Also see continuous-time Markov process.

Markov property

The general case

Let $(\Omega, \mathcal{F}, P)$ be a probability space with a filtration $(\mathcal{F}_s,\ s \in I)$, for some (totally ordered) index set $I$; and let $(S, \mathcal{S})$ be a measurable space. An $S$-valued stochastic process $X = (X_t,\ t \in I)$ adapted to the filtration is said to possess the Markov property with respect to the $\{\mathcal{F}_t\}$ if, for each $A \in \mathcal{S}$ and each $s, t \in I$ with $s < t$,

$$P(X_t \in A \mid \mathcal{F}_s) = P(X_t \in A \mid X_s).$$[4]

A Markov process is a stochastic process which satisfies the Markov property with respect to its natural filtration.

For discrete-time Markov chains

In the case where $S$ is a discrete set with the discrete sigma algebra and $I = \mathbb{N}$, this can be reformulated as follows:

$$P(X_n = x_n \mid X_{n-1} = x_{n-1}, X_{n-2} = x_{n-2}, \dots, X_0 = x_0) = P(X_n = x_n \mid X_{n-1} = x_{n-1}).$$
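For a concrete, minimal illustration of this reformulation, the Python sketch below (not part of the cited sources; the state space and transition matrix are hypothetical) simulates a discrete-time Markov chain in which each new state is drawn using only the current state.

```python
import random

# Hypothetical transition matrix on the state space {0, 1, 2}:
# P[i][j] is the probability of moving from state i to state j.
P = [
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.2, 0.2, 0.6],
]

def step(state, rng):
    """Draw X_n from X_{n-1} alone -- the Markov property in the discrete case."""
    return rng.choices(range(len(P[state])), weights=P[state])[0]

def simulate(x0, n, seed=0):
    """Return a sample path X_0, X_1, ..., X_n starting from x0."""
    rng = random.Random(seed)
    path = [x0]
    for _ in range(n):
        path.append(step(path[-1], rng))
    return path

print(simulate(x0=0, n=10))
```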

Examples

Gambling

Suppose that you start with $10 in poker chips, and you repeatedly wager $1 on a (fair) coin toss indefinitely, or until you lose all of your poker chips. If $X_n$ represents the number of dollars you have in chips after $n$ tosses, with $X_0 = 10$, then the sequence $\{X_n : n \in \mathbb{N}\}$ is a Markov process. If I know that you have 12 chips now, then it would be expected that with even odds, you will either have 11 or 13 chips after the next toss. This guess is not improved by the added knowledge that you started with 10 chips, then went up to 11, down to 10, up to 11, and then to 12.
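A short Python sketch of this gambling walk (illustrative only, not from the cited sources) is given below; each new chip count is produced from the current count alone, never from the path taken to reach it.

```python
import random

def gamblers_walk(start=10, max_tosses=1000, seed=1):
    """Simulate repeated $1 fair-coin wagers until ruin or max_tosses steps.

    Each step moves the chip count up or down by 1 with equal probability,
    using only the current count -- the Markov property of the walk.
    """
    rng = random.Random(seed)
    chips = [start]
    while chips[-1] > 0 and len(chips) <= max_tosses:
        chips.append(chips[-1] + rng.choice([-1, 1]))
    return chips

path = gamblers_walk()
print(path[:20])                                      # first few chip counts
print("ruined" if path[-1] == 0 else "still playing")
```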

A birth-death process

Suppose that you are popping one hundred kernels of popcorn, and each kernel will pop at an independent, exponentially distributed time. Let $N(t)$ denote the number of kernels which have popped up to time $t$. Then this is a continuous-time, non-homogeneous Markov process. If, after some amount of time, I want to guess how many kernels will pop in the next second, I need only know how many kernels have popped so far. It will not help me to know when they popped, so knowing $N(t)$ for previous times $t$ will not inform my guess.

The process described here is an approximation of a Poisson process; Poisson processes are also Markov processes.
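A rough Python sketch of the popcorn example (illustrative only; the popping rate is an arbitrary assumption) draws an independent exponential popping time for each kernel and reports $N(t)$ at a few times.

```python
import bisect
import random

def pop_times(kernels=100, rate=1.0, seed=2):
    """Draw an independent exponential popping time for each kernel."""
    rng = random.Random(seed)
    return sorted(rng.expovariate(rate) for _ in range(kernels))

def popped_by(times, t):
    """N(t): how many kernels have popped by time t."""
    return bisect.bisect_right(times, t)

times = pop_times()
for t in (0.5, 1.0, 2.0, 4.0):
    print(f"N({t}) = {popped_by(times, t)} kernels popped")
```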

A non-Markov example

Suppose that you have a coin purse containing five quarters (each worth 25c), five nickels (each worth 5c) and five dimes (each worth 10c), and one by one, you randomly draw coins from the purse and set them on a table. If $X_n$ represents the total value of the coins set on the table after $n$ draws, with $X_0 = 0$, then the sequence $\{X_n : n \in \mathbb{N}\}$ is not a Markov process.

To see why this is the case, suppose that in your first six draws, you draw all five nickels, and then a quarter, so $X_6 = \$0.50$. If we know not just $X_6$, but the earlier values as well, then we can determine which coins have been drawn, and we know that the next coin will not be a nickel, so we can determine that $X_7 \geq \$0.60$ with probability 1. But if we do not know the earlier values, then based only on the value $X_6$ we might guess that we had drawn four dimes and two nickels, in which case it would certainly be possible to draw another nickel next. Thus, our guesses about $X_7$ are impacted by our knowledge of values prior to $X_6$.
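As a rough empirical check (not part of the original text; the simulation set-up is an illustration), the Python sketch below compares the distribution of $X_7$ given only $X_6 = \$0.50$ with its distribution given the specific history above; the two disagree, which is exactly the failure of the Markov property.

```python
import random
from collections import Counter
from itertools import accumulate

# Coin values in cents: five quarters, five nickels, five dimes.
COINS = [25] * 5 + [5] * 5 + [10] * 5

def sample_path(rng):
    """Running totals X_1, ..., X_15 (in cents) for one random ordering of the purse."""
    order = COINS[:]
    rng.shuffle(order)
    return list(accumulate(order))

rng = random.Random(3)
given_present, given_history = Counter(), Counter()
for _ in range(200_000):
    path = sample_path(rng)
    if path[5] == 50:                               # X_6 = 50 cents
        given_present[path[6]] += 1                 # X_7 given only the present value
        if path[:6] == [5, 10, 15, 20, 25, 50]:     # five nickels, then a quarter
            given_history[path[6]] += 1             # X_7 given the full history

print("X_7 given X_6 = 50c:", dict(given_present))
print("X_7 given five nickels then a quarter:", dict(given_history))
```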

Analysing switching by Business Class customers

In analysing switching by Business Class customers between airlines, the following data has been obtained by British Airways (BA):

                               Next flight by
                               BA      Competition
  Last flight by  BA           0.85    0.15
                  Competition  0.10    0.90

For example, if the last flight by a Business Class customer was by BA, the probability that their next flight is by BA is 0.85. Business Class customers make 2 flights a year on average.

Currently BA have 30% of the Business Class market. What would you forecast BA's share of the Business Class market to be after two years?

Solution

We have the initial system state s1 given by s1 = [0.30, 0.70] and the transition matrix P is given by

P = | 0.85  0.15 |^2   =   | 0.7375  0.2625 |
    | 0.10  0.90 |          | 0.1750  0.8250 |

where the square term arises as Business Class customers make 2 flights a year on average.

Hence after one year has elapsed the state of the system is s2 = s1P = [0.34375, 0.65625].

After two years have elapsed the state of the system is s3 = s2P = [0.368, 0.632],

and note here that the elements of s2 and s3 add to one (as required).

So after two years have elapsed BA's share of the Business Class market is forecast to be 36.8%.
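As a check on the arithmetic, the following numpy sketch (not part of the original text) reproduces the figures above using the same s1, s2, s3 notation.

```python
import numpy as np

# Per-flight transition matrix between BA and the competition.
flight = np.array([[0.85, 0.15],
                   [0.10, 0.90]])

P = np.linalg.matrix_power(flight, 2)   # two flights per year -> yearly matrix
s1 = np.array([0.30, 0.70])             # current market shares [BA, competition]

s2 = s1 @ P                             # after one year
s3 = s2 @ P                             # after two years

print(P)              # [[0.7375 0.2625] [0.175  0.825 ]]
print(s2)             # [0.34375 0.65625]
print(s3, s3.sum())   # approx [0.368 0.632], elements sum to 1
```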

Markovian representations

In some cases, apparently non-Markovian processes may still have Markovian representations, constructed by expanding the concept of the 'current' and 'future' states. For example, let X be a non-Markovian process. Then define a process Y, such that each state of Y represents a time-interval of states of X. Mathematically, this takes the form:

$$Y(t) = \big\{ X(s) : s \in [a(t), b(t)] \big\}.$$

If Y has the Markov property, then it is a Markovian representation of X.

An example of a non-Markovian process with a Markovian representation is an autoregressive time series of order greater than one.[citation needed]
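As an illustration (a minimal sketch, not drawn from the article; the coefficients are arbitrary), the Python code below simulates an autoregressive series of order two, in which $X_n$ depends on the two previous values, by tracking the pair $Y_n = (X_n, X_{n-1})$, which does have the Markov property.

```python
import random

# Hypothetical AR(2) model: X_n = A1*X_{n-1} + A2*X_{n-2} + Gaussian noise.
A1, A2, SIGMA = 0.5, 0.3, 1.0

def ar2_step(y, rng):
    """One step of the Markovian representation Y_n = (X_n, X_{n-1}).

    The next pair is computed from the current pair alone, even though
    the underlying X series looks two steps into its past.
    """
    x_curr, x_prev = y
    x_next = A1 * x_curr + A2 * x_prev + rng.gauss(0.0, SIGMA)
    return (x_next, x_curr)

rng = random.Random(4)
y = (0.0, 0.0)                     # Y_0 = (X_0, X_{-1}), both taken as 0 here
xs = []
for _ in range(10):
    y = ar2_step(y, rng)
    xs.append(round(y[0], 3))
print(xs)
```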

In popular culture

The band Bad Religion has a song titled "The Markovian Process" on their album Stranger Than Fiction. The credited writer of the song, lead singer and current UCLA professor Greg Graffin, has a Ph.D. in zoology and degrees in anthropology and geology.

See also

References

  1. ^ Markov process (mathematics) - Britannica Online Encyclopedia
  2. ^ Everitt, B.S. (2002) The Cambridge Dictionary of Statistics. CUP. ISBN 0-521-81099-X
  3. ^ Dodge, Y. The Oxford Dictionary of Statistical Terms. OUP. ISBN 0-19-920613-9
  4. ^ Durrett, Rick. Probability: Theory and Examples. Fourth Edition. Cambridge: Cambridge University Press, 2010.
  • Weisstein, Eric W. "Markov process". MathWorld.