Markov chains Flashcards

(3 cards)

1
Q

What is a Markov chain?

A

A Markov chain is a sequence of events or states where the probability of moving to the next state depends only on the current state, not on any earlier states. This property is known as the “memoryless” or Markov property.
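The memoryless property can be sketched in a short simulation. This is a minimal illustration, assuming a hypothetical two-state weather model (the states and probabilities are made up): notice that sampling the next state uses only the current state, never the earlier history.

```python
import random

# Hypothetical two-state weather model used only for illustration.
# transitions[current][next] is the probability of moving from
# `current` to `next`.
transitions = {
    "Sunny": {"Sunny": 0.8, "Rainy": 0.2},
    "Rainy": {"Sunny": 0.4, "Rainy": 0.6},
}

def next_state(current, rng):
    """Sample the next state using ONLY the current state (Markov property)."""
    states = list(transitions[current])
    weights = [transitions[current][s] for s in states]
    return rng.choices(states, weights=weights, k=1)[0]

def simulate(start, steps, seed=0):
    """Run the chain for `steps` transitions and return the visited states."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(steps):
        chain.append(next_state(chain[-1], rng))
    return chain
```

For example, `simulate("Sunny", 10)` returns a list of 11 states, each drawn from the row of the current state alone.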

2
Q

What are transition probabilities?

A

Transition probabilities quantify the likelihood of moving from one state to another. They are organized into a transition matrix.
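One common way to organize transition probabilities is a nested list indexed by state, so that `matrix[i][j]` holds the probability of moving from state `i` to state `j`. A minimal sketch, assuming a hypothetical three-state example (the state names and numbers are invented):

```python
# Hypothetical 3-state example. matrix[i][j] = probability of moving
# from states[i] to states[j]; each row lists all outcomes from one state.
states = ["A", "B", "C"]
matrix = [
    [0.5, 0.3, 0.2],  # from A
    [0.1, 0.6, 0.3],  # from B
    [0.0, 0.4, 0.6],  # from C
]

def transition_probability(frm, to):
    """Look up the probability of moving from state `frm` to state `to`."""
    return matrix[states.index(frm)][states.index(to)]
```

So `transition_probability("A", "C")` reads the entry in row A, column C of the matrix.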

3
Q

What is a transition matrix?

A
  • a square array where each entry gives the probability of transitioning from the row state to the column state
  • for a system with n states, the matrix is an n×n grid
  • each row sums to 1, since the process must move to some state (possibly staying where it is)
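The properties above can be checked directly. A minimal sketch, assuming the matrix is given as a nested list (the helper name and tolerance are arbitrary choices):

```python
def is_valid_transition_matrix(m, tol=1e-9):
    """Check the transition-matrix properties: square (n x n) shape,
    non-negative entries, and each row summing to 1."""
    n = len(m)
    if any(len(row) != n for row in m):   # must be square
        return False
    if any(p < 0 for row in m for p in row):  # probabilities are non-negative
        return False
    # each row is a probability distribution over the next state
    return all(abs(sum(row) - 1.0) <= tol for row in m)
```

For example, `[[0.5, 0.5], [0.2, 0.8]]` passes, while a matrix with a row summing to 1.1 fails.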