Markov Chains

Junlin Liu
Nov 12, 2021

--

Model for systems that change over time in a random manner

A Markov chain is a special type of stochastic process, defined in terms of the conditional distributions of future states given the present and past states: the distribution of the next state depends only on the current state, not on the earlier history. This is the Markov property.

A sequence of random variables 𝑋1, 𝑋2,… is called a stochastic process or random process with discrete time parameter.
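To make the definitions concrete, here is a minimal sketch of sampling such a sequence X1, X2, … from a Markov chain. The two-state "weather" chain and its transition probabilities are illustrative assumptions, not taken from the text.

```python
import random

# Hypothetical two-state chain (numbers are illustrative): 0 = sunny, 1 = rainy.
# P[i][j] is the probability of moving from state i to state j in one step.
P = [[0.9, 0.1],
     [0.5, 0.5]]

def simulate(P, start, steps, seed=0):
    """Draw a sample path X_1, X_2, ... by repeatedly sampling the next
    state from the row of P for the current state (the Markov property:
    only the current state is consulted, never the earlier history)."""
    rng = random.Random(seed)
    state, path = start, [start]
    for _ in range(steps):
        state = rng.choices(range(len(P)), weights=P[state])[0]
        path.append(state)
    return path

path = simulate(P, start=0, steps=10)
```

Each step reads only the current `state`, so the whole history beyond it is irrelevant to the next draw.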

Homogeneous Markov Chain

A Markov chain is homogeneous when its transition probabilities do not change over time; they are collected in a transition matrix P whose (i, j) entry is the probability of moving from state i to state j in one step. The initial probability vector v gives the distribution of the state at time 1, P^n gives the probabilities of transitions over n time periods, and vP^n gives the distribution of the state at time n + 1.

A stationary distribution is a probability vector v such that vP = v.
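The condition vP = v says v is a left eigenvector of P with eigenvalue 1, so one way to find it is an eigendecomposition of the transpose. A sketch, again with an assumed 2-state transition matrix:

```python
import numpy as np

# Assumed transition matrix for the example.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# v P = v means v is a left eigenvector of P for eigenvalue 1,
# i.e. a (right) eigenvector of P transpose.
eigvals, eigvecs = np.linalg.eig(P.T)
idx = np.argmin(np.abs(eigvals - 1.0))   # pick the eigenvalue closest to 1
v = np.real(eigvecs[:, idx])
v = v / v.sum()                          # rescale to a probability vector
```

For this matrix the stationary distribution works out to (5/6, 1/6); iterating vP^n from any starting distribution converges to it.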

Example Problems

--

