Definition of Markov

1. Noun. Russian mathematician (1856-1922).

Exact synonyms: Andre Markoff, Andrei Markov, Markoff
Generic synonyms: Mathematician
Derivative terms: Markovian

Lexicographical Neighbors of Markov

Mark Antony
Mark Clark
Mark Hopkins
Mark Rothko
Mark Tobey
Mark Twain
Mark Wayne Clark
Markab
Markan
Markaz-ud-Dawa-wal-Irshad
Markie
Markland
Markoff
Markoff chain
Markoff process
Markov (current term)
Markov chain
Markov chains
Markov jump process
Markov process
Markova
Markovian
Markowitz
Marks
Marks and Sparks
Markus
Marky
Marla
Marlboro
Marlborough

Literary usage of Markov

Below are examples of this term's usage in modern and classical literature:

1. State of the Art in Probability and Statistics: Festschrift for Willem R by Mathisca de Gunst, Chris Klaassen, A. W. van der Vaart (2001)
"Keywords and phrases: Area-interaction process, Berman-Turner device, Conditional intensity, Directed Markov point processes, Efficiently estimable ..."

2. Statistics, Probability, and Game Theory: Papers in Honor of David Blackwell by David Blackwell, Thomas Shelburne Ferguson, Lloyd S. Shapley, James B. MacQueen (1996)
"It is desired to create a Markov process as a model for some natural process. Starting with what may be a somewhat vague idea of its transition probability ..."

3. Statistics in Molecular Biology and Genetics: Selected Proceedings of a 1997 by Françoise Seillier-Moiseiwitsch (1999)
"The challenging part is to approximate the posterior, and we do this by constructing a Markov chain having the posterior as its invariant distribution, ..."

4. Adaptive Designs: Selected Proceedings of a 1992 Joint AMS-IMS-SIAM Summer by Nancy Flournoy, William F. Rosenberger, American Mathematical Society, Institute of Mathematical Statistics (1995)
"The urn model may, in turn, be embedded in a Markov branching process, and results from the theory of these processes may then be used to prove results for ..."

5. Stochastic Inequalities by Moshe Shaked, Yung Liang Tong (1992)
"(Xt; t > 0) is an irreducible finite-state reversible Markov chain in continuous time. The state space is / and the transition rate matrix is Q = (q(i,j);i ..."

6. Dynamics & Stochastics: Festschrift in Honour of M.S. Keane by Dee Denteneer, F. den Hollander, M. S. Keane, Evgeny Verbitskiy (2006)
"This is proven by using the combinatorics of the binomial coefficients on the regenerative construction of the Markov measure. 1. ..."
