Definition of Markoff chain
1.
Noun.
A Markov process for which the time parameter takes discrete values.
Exact synonyms:
Markov Chain
Generic synonyms:
Markoff Process, Markov Process
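To make the definition concrete: at each discrete time step the chain moves to a new state with a probability that depends only on the current state, not on the path taken to reach it. The Python sketch below is a minimal illustration, assuming a hypothetical two-state weather model; the state names and transition probabilities are invented for the example.

import random

# Hypothetical two-state example; the states and probabilities
# are invented for illustration.
states = ["sunny", "rainy"]

# transition[i][j] = probability of moving from state i to state j
# at the next discrete time step. Each row sums to 1.
transition = [
    [0.8, 0.2],  # from "sunny"
    [0.4, 0.6],  # from "rainy"
]

def step(current):
    """Advance the chain one discrete time step from state index `current`."""
    return random.choices(range(len(states)), weights=transition[current])[0]

# Run the chain for ten discrete time steps, starting in "sunny" (index 0).
state = 0
for t in range(10):
    state = step(state)
    print(t, states[state])

Because step looks only at the current state, the simulation exhibits the Markov property: the next state is independent of the history that led to the present one.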
Lexicographical Neighbors of Markoff Chain
markhoor
markhoors
markhor
markhors
marking
markings
marking ink
markka
markkaa
markkas
markman
markmen
Markoff
Markoff chain
Markoff process
Markov
Markova
Markovian
Markov chain
Markov chains
Markov process
Marks
marksman
marksmanship
marksmanships
marksmen
markswoman