Definition of Markoff process
1. Noun. A simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived in the present state.
Exact synonyms: Markov Process
Specialized synonyms: Markoff Chain, Markov Chain
Generic synonyms: Stochastic Process
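For a discrete-time chain with successive states X_0, X_1, X_2, ..., one standard way to write the property described above (the future depends only on the present state, not on the path taken to reach it) is:

\[
P(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \ldots, X_0 = x_0) = P(X_{n+1} = x \mid X_n = x_n)
\]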