Definition of Markoff chain
1. Noun. A Markov process whose parameter takes discrete time values.
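To make the definition concrete, here is a minimal sketch in Python (all names here are illustrative, not from the source) that advances a two-state chain one discrete time step at a time according to a transition probability matrix:

import random

def step(transition, state, rng):
    # Draw the next state from the current state's transition row.
    r = rng.random()
    cumulative = 0.0
    for next_state, prob in transition[state]:
        cumulative += prob
        if r < cumulative:
            return next_state
    return state  # guard against floating-point shortfall

def simulate(transition, start, n_steps, seed=0):
    # Observe the chain at discrete times 0, 1, ..., n_steps.
    rng = random.Random(seed)
    path = [start]
    for _ in range(n_steps):
        path.append(step(transition, path[-1], rng))
    return path

# Example: a hypothetical two-state chain; each row sums to 1.
transition = {
    "A": [("A", 0.9), ("B", 0.1)],
    "B": [("A", 0.5), ("B", 0.5)],
}
print(simulate(transition, "A", 10))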
Literary usage of Markoff chain
Below you will find example usage of this term as found in modern and classical literature:
1. R.R. Bahadur's Lectures on the Theory of Estimation by Raghu Raj Bahadur, Stephen M. Stigler, Wing Hung Wong, Daming Xu (2002)
"... 1) x (0, 1) and that a markoff chain with transition probability matrix as
above starts at Т and is observed for n one-step transitions. ..."
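The quoted passage concerns estimating a chain from n observed one-step transitions. As an illustration of that setup (a sketch, not the book's own derivation; the function and variable names are hypothetical), the Python snippet below computes the standard maximum-likelihood estimate of a transition matrix by counting observed transitions and normalizing each row:

from collections import Counter, defaultdict

def estimate_transition_matrix(path):
    # Count observed one-step transitions along the path.
    counts = defaultdict(Counter)
    for current, nxt in zip(path, path[1:]):
        counts[current][nxt] += 1
    # Normalize each row of counts into probabilities (the MLE).
    return {
        state: {s: c / sum(nexts.values()) for s, c in nexts.items()}
        for state, nexts in counts.items()
    }

# Example: estimate from a short observed path.
print(estimate_transition_matrix(["A", "A", "B", "A", "B", "B", "A"]))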