Markov Chains

A model for systems that change over time in a random manner.

A Markov chain is a special type of stochastic process, defined in terms of the conditional distributions of future states given the present and past states: the distribution of the next state depends only on the current state, not on the earlier history (the Markov property).
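
In symbols, for any states x_1, …, x_n and s, the Markov property says the conditional distribution of the next state depends on the past only through the present state:

P(X_{n+1} = s | X_1 = x_1, …, X_n = x_n) = P(X_{n+1} = s | X_n = x_n)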

A sequence of random variables 𝑋1, 𝑋2, … is called a stochastic process (or random process) with a discrete time parameter.

The initial probability vector v gives the distribution of the state at time 1. If P is the one-step transition matrix, then P^n gives the transition probabilities over n time periods, and vP^n gives the distribution of the state at time n + 1.
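
As a quick numerical sketch (the two-state transition matrix and initial vector below are hypothetical, chosen only for illustration), NumPy can compute P^n and vP^n directly:

```python
import numpy as np

# Hypothetical one-step transition matrix; each row sums to 1.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# Hypothetical distribution of the state at time 1.
v = np.array([0.2, 0.8])

n = 2
P_n = np.linalg.matrix_power(P, n)  # transition probabilities over n time periods
v_n = v @ P_n                       # distribution of the state at time n + 1

print(P_n)  # [[0.86 0.14]
            #  [0.70 0.30]]
print(v_n)  # [0.732 0.268]
```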

A stationary distribution is a probability vector v such that vP = v.
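
Numerically, one way to find a stationary distribution (shown here on the same illustrative matrix as above) is to take a left eigenvector of P with eigenvalue 1 and normalize it so its entries sum to 1:

```python
import numpy as np

P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# vP = v means v is a left eigenvector of P with eigenvalue 1,
# i.e. an ordinary (right) eigenvector of P.T with eigenvalue 1.
eigenvalues, eigenvectors = np.linalg.eig(P.T)
idx = np.argmin(np.abs(eigenvalues - 1.0))
v = np.real(eigenvectors[:, idx])
v = v / v.sum()   # normalize so the entries sum to 1

print(v)      # ~[0.8333 0.1667], i.e. (5/6, 1/6)
print(v @ P)  # equals v, up to floating-point error
```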

Example Problems
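
As a simple illustrative problem (using the same hypothetical two-state chain as in the sketches above): find the stationary distribution of the chain with transition matrix P = [[0.9, 0.1], [0.5, 0.5]]. Writing v = (v_1, v_2), the equation vP = v gives 0.9 v_1 + 0.5 v_2 = v_1, i.e. v_1 = 5 v_2. Combined with v_1 + v_2 = 1, this yields v = (5/6, 1/6) ≈ (0.833, 0.167), matching the numerical result above.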
