Source : WordNet®
Markov chain
n : a Markov process for which the parameter is discrete time
values [syn: {Markoff chain}]
Source : Free On-Line Dictionary of Computing
Markov chain
(Named after {Andrei Markov}) A model of sequences of
events in which the probability of each event depends only
on the state reached in the preceding event, not on the
earlier history of the sequence.
A {Markov process} observed at discrete time steps is
described by a Markov chain.
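The dependence on only the current state can be captured by
a table of transition probabilities.  The following is a
minimal sketch in Python; the state names and probabilities
are invented purely for illustration:

  import random

  # Hypothetical two-state weather chain: the transition
  # probabilities depend only on the current state, not on
  # the earlier history of the sequence.
  TRANSITIONS = {
      "sunny": {"sunny": 0.8, "rainy": 0.2},
      "rainy": {"sunny": 0.4, "rainy": 0.6},
  }

  def next_state(current):
      """Draw the next state using only the current state's row."""
      states = list(TRANSITIONS[current])
      weights = [TRANSITIONS[current][s] for s in states]
      return random.choices(states, weights=weights)[0]

  def simulate(start, steps):
      """Generate a sample path of the chain."""
      path = [start]
      for _ in range(steps):
          path.append(next_state(path[-1]))
      return path

  print(simulate("sunny", 10))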
In {simulation}, the Markov chain principle is used to draw
samples from a probability density function, which are then
applied to the model.  {Simscript} II.5 uses this approach
for some of its modelling functions.
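One common way to realise such sampling is a random-walk
Metropolis sampler, in which successive samples form a
Markov chain whose long-run distribution matches the target
density.  The sketch below is a generic Python illustration,
not Simscript code; the target density and step size are
assumptions for the example:

  import math
  import random

  def target_pdf(x):
      # Hypothetical target: an unnormalised standard normal density.
      return math.exp(-0.5 * x * x)

  def metropolis_samples(n, step=1.0, x0=0.0):
      """Random-walk Metropolis: each sample depends only on the previous one."""
      samples = [x0]
      for _ in range(n - 1):
          x = samples[-1]
          proposal = x + random.uniform(-step, step)
          # Accept with probability min(1, p(proposal) / p(x)).
          if random.random() < target_pdf(proposal) / target_pdf(x):
              samples.append(proposal)
          else:
              samples.append(x)
      return samples

  draws = metropolis_samples(10000)
  print(sum(draws) / len(draws))  # should be close to the target mean of 0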
(1995-02-23)