Definition of Markov chain

1. Noun. A Markov process for which the parameter takes discrete time values.

Exact synonyms: Markoff Chain
Generic synonyms: Markoff Process, Markov Process
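
The definition can be illustrated with a small sketch: a hypothetical two-state, discrete-time Markov chain in Python, where the next state depends only on the current state and the time parameter advances in whole steps. The state names and transition probabilities below are illustrative assumptions, not part of this entry.

    import random

    states = ["sunny", "rainy"]
    # transition[i][j]: probability of moving from state i to state j
    transition = [
        [0.8, 0.2],  # from "sunny"
        [0.4, 0.6],  # from "rainy"
    ]

    def step(current):
        # Sample the next state using only the current state (the Markov property);
        # the time parameter advances one discrete step per call.
        return random.choices(range(len(states)), weights=transition[current])[0]

    state = 0  # start in "sunny"
    for t in range(10):
        print(t, states[state])
        state = step(state)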

Lexicographical Neighbors of Markov Chain

marking
markings
marking ink
markka
markkaa
markkas
markman
markmen
Markoff
Markoff chain
Markoff process
Markov
Markova
Markovian
Markov chain (current term)
Markov chains
Markov process
Marks
marksman
marksmanship
marksmanships
marksmen
markswoman
markswomen
markup
markups
markup language
markweed

