Medical Definition of Markov chains
1. A stochastic process in which the conditional probability for a state at any future instant, given the present state, is unaffected by additional knowledge of the past history of the process.
(12 Mar 2008)
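Stated formally (a sketch in standard notation, not part of the original entry, where X_n denotes the state of the process at step n), the defining Markov property is

    P(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \dots, X_0 = x_0) = P(X_{n+1} = x \mid X_n = x_n)

that is, once the present state X_n is known, the earlier states contribute no further information about the next state.

The same idea can be illustrated with a minimal simulation; the two-state "healthy"/"ill" chain and its transition probabilities below are illustrative assumptions, not part of the entry:

```python
import random

# Illustrative transition probabilities for a two-state chain (assumed values).
TRANSITIONS = {
    "healthy": {"healthy": 0.9, "ill": 0.1},
    "ill":     {"healthy": 0.5, "ill": 0.5},
}

def step(state):
    # The next state is drawn using only the current state;
    # the past history is never consulted.
    outcomes, probs = zip(*TRANSITIONS[state].items())
    return random.choices(outcomes, weights=probs)[0]

state = "healthy"
history = [state]
for _ in range(10):
    state = step(state)
    history.append(state)
print(history)
```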