
Markov Chains

What is a Markov Chain?

  • A system that moves between states over time.
  • One important rule (the Markov property): the next state of the system depends only on its current state.
  • Can be drawn as a state diagram or written as a transition matrix.
    • The state diagram shows every possible state and the transition probabilities between them.
    • The transition matrix $A$:
      • encodes the same information as the state diagram
      • each entry is the probability of moving from one state to another
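A minimal sketch of these ideas, assuming a hypothetical 2-state weather chain (the states and probabilities are invented for illustration):

```python
import numpy as np

# Hypothetical 2-state chain: state 0 = sunny, state 1 = rainy.
# Column convention: A[i, j] is the probability of moving TO state i
# FROM state j, so each column sums to 1.
A = np.array([[0.9, 0.5],
              [0.1, 0.5]])

assert np.allclose(A.sum(axis=0), 1.0)  # each column is a probability distribution

# Simulate a few steps: the next state depends only on the current one.
rng = np.random.default_rng(0)
state = 0
trajectory = [state]
for _ in range(5):
    state = rng.choice(2, p=A[:, state])
    trajectory.append(state)
print(trajectory)
```

The column convention is chosen so that the equilibrium equation $A \pi = \pi$ below applies with $\pi$ as a column vector.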

Equilibrium

  • the probability distribution over states no longer changes from one step to the next
  • the distribution over states at equilibrium is called the stationary distribution
  • assume such an equilibrium distribution $\pi$
    • fixed point: $A \pi = \pi$
    • where $\pi$ is:
      • an eigenvector of the matrix $A$ with eigenvalue 1
      • the probability of each state the system could be in, assuming it has reached equilibrium
    • use two conditions to solve for $\pi$:
      • $A \pi = \pi$
      • the probabilities sum to 1: $\sum_i \pi_i = 1$
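These two conditions can be solved directly with linear algebra. A sketch using a hypothetical column-stochastic 2-state matrix (numbers invented for illustration):

```python
import numpy as np

# Hypothetical column-stochastic transition matrix (columns sum to 1).
A = np.array([[0.9, 0.5],
              [0.1, 0.5]])

# Method 1: pi is the eigenvector of A with eigenvalue 1, rescaled to sum to 1.
vals, vecs = np.linalg.eig(A)
v = vecs[:, np.isclose(vals, 1.0)].ravel().real
pi_eig = v / v.sum()

# Method 2: solve A pi = pi together with sum(pi) = 1 as one linear system,
# stacking (A - I) pi = 0 on top of the normalisation row.
M = np.vstack([A - np.eye(2), np.ones((1, 2))])
b = np.array([0.0, 0.0, 1.0])
pi_solve, *_ = np.linalg.lstsq(M, b, rcond=None)

print(pi_eig)    # both methods agree on the stationary distribution
print(pi_solve)
```

Rescaling the eigenvector is needed because `np.linalg.eig` returns unit-norm eigenvectors, not probability vectors.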

Transience and Recurrence of Markov Chains

  • transient state
    • a process beginning in a transient state has a nonzero probability of never returning to it (return probability < 1)
  • recurrent state
    • a process beginning in a recurrent state returns to it with probability 1
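These definitions can be checked by simulation: estimate the probability that a walk started at a state ever comes back to it. The chain below is invented for illustration; state 0 leaks into an absorbing state, so it is transient, while the absorbing state 1 is recurrent.

```python
import numpy as np

# Hypothetical chain (column convention, columns sum to 1):
# from state 0 the walk stays with prob 0.5 or moves to state 1,
# which is absorbing. So state 0 is transient, state 1 is recurrent.
A = np.array([[0.5, 0.0],
              [0.5, 1.0]])

def estimate_return_prob(A, start, n_runs=5000, max_steps=50, seed=0):
    """Fraction of runs that revisit `start` within max_steps steps."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    returned = 0
    for _ in range(n_runs):
        state = start
        for _ in range(max_steps):
            state = rng.choice(n, p=A[:, state])
            if state == start:
                returned += 1
                break
    return returned / n_runs

p0 = estimate_return_prob(A, 0)  # near 0.5, well below 1: transient
p1 = estimate_return_prob(A, 1)  # exactly 1 here: recurrent (absorbing)
```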

Ergodic Markov Chains

  • an aperiodic Markov Chain, all states of which are positive recurrent
  • irreducible: every state is reachable from every other state

$n$-step Transition Matrix

  • $P_{ij}(n) = (A^n)_{ij}$

Chapman-Kolmogorov Theorem

  • $P_{ij}(n) = \sum_{k} P_{ik}(r) \, P_{kj}(n-r)$ for any $0 \le r \le n$
  • $A^{\infty}_{ij} = P_{ij}(\infty)$
    • the probability of being at $j$ after infinitely many steps, starting from $i$
    • for an ergodic chain this limit is the same for every starting state $i$
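Both identities are easy to verify numerically with matrix powers. A sketch on a hypothetical 2-state ergodic chain (numbers invented for illustration):

```python
import numpy as np

# Hypothetical column-stochastic matrix of an ergodic 2-state chain.
A = np.array([[0.9, 0.5],
              [0.1, 0.5]])

# n-step transition probabilities are the entries of A^n.
A5 = np.linalg.matrix_power(A, 5)

# Chapman-Kolmogorov: splitting n = 5 steps into r = 2 then n - r = 3
# steps gives the same n-step matrix.
assert np.allclose(A5, np.linalg.matrix_power(A, 2) @ np.linalg.matrix_power(A, 3))

# As n grows, A^n converges: every column approaches the stationary
# distribution, regardless of the starting state.
A100 = np.linalg.matrix_power(A, 100)
print(A100)
```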

References

  • https://www.youtube.com/watch?v=i3AkTO9HLXo&list=PLM8wYQRetTxBkdvBtz-gw8b9lcVkdXQKV&index=1
  • https://brilliant.org/wiki/transience-and-recurrence/
  • https://brilliant.org/wiki/ergodic-markov-chains/
  • https://www.youtube.com/watch?v=Zo3ieESzr4E&list=PLM8wYQRetTxBkdvBtz-gw8b9lcVkdXQKV&index=3
This post is licensed under CC BY 4.0 by the author.