What is a Markov chain?
A Markov chain is a sequence of events or states where the probability of moving to the next state depends only on the current state, not on any previous states. This property is known as the "memoryless" or Markov property.
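The memoryless property can be illustrated with a small simulation. Below is a minimal sketch using a hypothetical two-state weather chain (the states, probabilities, and function names are illustrative, not from the source): at each step, the next state is sampled using only the current state.

```python
import random

# Hypothetical two-state weather chain: the next state depends
# only on the current state (the Markov property).
transitions = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def next_state(current):
    # Sample the next state using only the current state --
    # earlier history is never consulted.
    states = list(transitions[current])
    weights = list(transitions[current].values())
    return random.choices(states, weights=weights)[0]

def simulate(start, steps):
    chain = [start]
    for _ in range(steps):
        chain.append(next_state(chain[-1]))
    return chain

print(simulate("sunny", 5))
```

Note that `next_state` takes only the current state as input; that signature is the Markov property made concrete.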
What are transition probabilities?
Transition probabilities quantify the likelihood of moving from one state to another. They are organized into a transition matrix.
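As a concrete sketch of that organization (the states and numbers here are hypothetical), each row of the matrix holds the transition probabilities out of one state, so every row must sum to 1:

```python
# Hypothetical transition matrix for the states ["sunny", "rainy"]:
# entry P[i][j] is the probability of moving from state i to state j.
P = [
    [0.8, 0.2],  # from "sunny": 80% stay sunny, 20% turn rainy
    [0.4, 0.6],  # from "rainy": 40% turn sunny, 60% stay rainy
]

# Each row is a probability distribution over next states,
# so it must sum to 1.
for row in P:
    assert abs(sum(row) - 1.0) < 1e-9
print("all rows sum to 1")
```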
What is a transition matrix?