Markov process n : a simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived in the present state [syn: {Markoff process}]
Markov process

A process in which the sequence of events can be described by a {Markov chain}.

(1995-02-23)
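The defining property above, that the next state depends only on the present state and not on the path taken to reach it, can be illustrated with a minimal sketch in Python. The two-state weather model and its transition probabilities are invented for the example and are not part of either entry.

```python
import random

# Hypothetical two-state weather model: each row gives the distribution of
# the NEXT state conditioned only on the CURRENT state (the Markov property).
TRANSITIONS = {
    "sunny": [("sunny", 0.9), ("rainy", 0.1)],
    "rainy": [("sunny", 0.5), ("rainy", 0.5)],
}

def next_state(state, rng):
    """Sample the next state; only `state` matters, never the history."""
    states, weights = zip(*TRANSITIONS[state])
    return rng.choices(states, weights=weights)[0]

def walk(start, steps, seed=0):
    """Generate one realization of the Markov chain as a list of states."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(steps):
        chain.append(next_state(chain[-1], rng))
    return chain
```

The sequence produced by `walk` is a Markov chain in the sense of the second definition: the distribution of each step is fully determined by the state immediately before it.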