Definition of Markoff chain

1. Noun. A Markov process whose parameter takes discrete time values; that is, a process that moves between its states at discrete time steps, with the next state depending only on the current state.

Exact synonyms: Markov Chain
Generic synonyms: Markoff Process, Markov Process
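
Illustration: a minimal Python sketch of the definition above. The two weather states and their transition probabilities are invented for the example; the point is only that the chain evolves at discrete time values n = 0, 1, 2, ... and that each next state is drawn using the current state alone.

import random

# Illustrative transition probabilities P(next state | current state);
# the states and numbers are assumptions made up for this example.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    # The Markov property: the draw depends only on the current state,
    # not on any earlier history of the chain.
    r = random.random()
    cumulative = 0.0
    for next_state, p in TRANSITIONS[state].items():
        cumulative += p
        if r < cumulative:
            return next_state
    return next_state  # guard against floating-point rounding

# Simulate the chain at discrete time steps n = 0, 1, 2, ...
state = "sunny"
for n in range(5):
    print(n, state)
    state = step(state)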

Lexicographical Neighbors of Markoff Chain

markhoor
markhoors
markhor
markhors
marking
marking ink
markings
markka
markkaa
markkas
markman
markmen
Markoff
Markoff chain
Markoff process
Markov
Markova
Markovian
Markov chain
Markov chains
Markov process
Marks
marksman
marksmanship
marksmanships
marksmen
markswoman
