Definition of Markov process

1. Noun. A simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived in the present state.

Exact synonyms: Markoff Process
Specialized synonyms: Markoff Chain, Markov Chain
Generic synonyms: Stochastic Process
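
Definition 1 above can be stated symbolically. As a sketch for a discrete-time process with states X_0, X_1, X_2, ... (this notation is an assumption, not part of the entry), the Markov property reads:

P(X_{n+1} = x | X_n = x_n, X_{n-1} = x_{n-1}, ..., X_0 = x_0) = P(X_{n+1} = x | X_n = x_n)

for every n and every choice of states: conditioning on the full history collapses to conditioning on the present state alone.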

Definition of Markov process

1. Noun. (probability theory) A stochastic process in which the probability distribution of future states, conditional on the present state, is independent of the path of past states. ¹

¹ Source: wiktionary.com
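
As an illustration of the property these definitions describe, below is a minimal simulation sketch in Python. The two-state weather chain and its transition probabilities are hypothetical examples, not taken from any of the dictionaries cited here; the point is that the next state is sampled from the current state alone.

import random

# Hypothetical two-state chain: the distribution of the next state
# depends only on the current state, never on earlier states.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def next_state(current):
    # Sample the next state using only the current state (the Markov property).
    states = list(TRANSITIONS[current])
    weights = [TRANSITIONS[current][s] for s in states]
    return random.choices(states, weights=weights)[0]

def simulate(start, steps):
    # Generate a sample path; earlier states are never consulted.
    path = [start]
    for _ in range(steps):
        path.append(next_state(path[-1]))
    return path

print(simulate("sunny", 10))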

Medical Definition of Markov process

1. A stochastic process in which the conditional probability for a state at any future instant, given the present state, is unaffected by knowledge of the past history of events. (12 Mar 2008)
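
A worked instance of this statement, using the hypothetical two-state chain sketched above: once the present state is given, knowing the full history changes nothing, e.g.

P(X_2 = sunny | X_1 = rainy, X_0 = sunny) = P(X_2 = sunny | X_1 = rainy) = 0.4

because the transition probability out of "rainy" is all that matters.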

Lexicographical Neighbors of Markov Process

Mark Tobey
Mark Twain
Mark Wayne Clark
Markab
Markan
Markaz-ud-Dawa-wal-Irshad
Markie
Markland
Markoff
Markoff chain
Markoff process
Markov
Markov chain
Markov chains
Markov jump process
Markov process (current term)
Markova
Markovian
Markowitz
Marks
Marks and Sparks
Markus
Marky
Marla
Marlboro
Marlborough
Marlburian
Marlburians
Marleen
Marlene

Other Resources:

Search for Markov process on Dictionary.com!
Search for Markov process on Thesaurus.com!
Search for Markov process on Google!
Search for Markov process on Wikipedia!