Definition of Markov process
1. Noun. A simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived in the present state.
Exact synonyms: Markoff Process
Specialized synonyms: Markoff Chain, Markov Chain
Generic synonyms: Stochastic Process
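In symbols, the property described above can be stated for a discrete-time process X_0, X_1, X_2, … (this notation is added here for illustration and is not part of the dictionary entry):

\[
  P(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \dots, X_0 = x_0)
  = P(X_{n+1} = x \mid X_n = x_n)
\]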
Definition of Markov process
1. Noun. (probability theory) A stochastic process in which the probability distribution of the current state is conditionally independent of the path of past states. ¹
¹ Source: wiktionary.com
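As an informal illustration of these definitions (not part of any source entry), the following Python sketch simulates a two-state Markov chain. The state names and transition probabilities are invented for the example; the point is that the next state is sampled using only the current state.

import random

# Hypothetical two-state chain; transition probabilities are invented for
# illustration. Each row gives P(next state | current state), so the next
# state depends only on the current one (the Markov property).
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    """Sample the next state using only the current state."""
    probs = TRANSITIONS[state]
    return random.choices(list(probs), weights=list(probs.values()))[0]

def simulate(start, n_steps):
    """Generate a sample path of length n_steps + 1 starting from `start`."""
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1]))
    return path

print(simulate("sunny", 10))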
Medical Definition of Markov process
1.