Definition of Transition matrix
1. Noun. (mathematics, stochastic processes) A square matrix whose rows consist of nonnegative real numbers, with each row summing to 1. Used to describe the transitions of a Markov chain; its element in the i'th row and j'th column gives the probability of moving from state i to state j in one time step. ¹
¹ Source: wiktionary.com
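As a minimal sketch of the definition above (the two states and their probabilities are invented for illustration), here is a 2-state transition matrix and one time step of the chain:

```python
# Hypothetical 2-state Markov chain: states 0 and 1.
# P[i][j] = probability of moving from state i to state j in one step.
P = [
    [0.9, 0.1],  # row 0: nonnegative entries summing to 1
    [0.4, 0.6],  # row 1: nonnegative entries summing to 1
]

# Check the defining property: each row is nonnegative and sums to 1.
assert all(p >= 0 for row in P for p in row)
assert all(abs(sum(row) - 1.0) < 1e-12 for row in P)

# One time step: the new distribution over states is the old
# distribution multiplied by the transition matrix.
dist = [1.0, 0.0]  # start with certainty in state 0
dist = [sum(dist[i] * P[i][j] for i in range(2)) for j in range(2)]
print(dist)  # → [0.9, 0.1]
```

Starting from state 0, after one step the chain stays in state 0 with probability 0.9 and moves to state 1 with probability 0.1, matching the first row of the matrix.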