Definition of Markov chain
1.
Noun.
A Markov process whose parameter takes discrete time values; see the formulation below the synonym list.
Exact synonyms:
Markoff Chain
Generic synonyms:
Markoff Process, Markov Process
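As a point of reference (not part of the original entry), the discrete-time Markov property underlying this definition is commonly written, for a chain X_0, X_1, X_2, \ldots with n the discrete time parameter, as

P(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \ldots, X_0 = x_0) = P(X_{n+1} = x \mid X_n = x_n).

In words, the distribution of the next state depends only on the current state, and the time index n runs over discrete values, which is what distinguishes a Markov chain from a general Markov process.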
Lexicographical Neighbors of Markov Chain
marking
markings
marking ink
markka
markkaa
markkas
markman
markmen
Markoff
Markoff chain
Markoff process
Markov
Markova
Markovian
Markov chain (current term)
Markov chains
Markov process
Marks
marksman
marksmanship
marksmanships
marksmen
markswoman
markswomen
markup
markups
markup language
markweed