
Markov chain

Source : WordNet®

Markov chain
     n : a Markov process for which the parameter is discrete time
         values [syn: {Markoff chain}]

Source : Free On-Line Dictionary of Computing

Markov chain
     
         (Named after {Andrei Markov}) A model of
        sequences of events in which the probability of each event
        depends only on the event that immediately preceded it (the
        "Markov property").
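
A small sketch of the definition above: a Markov chain over a finite set of states can be written as a transition matrix whose rows are probability distributions over the next state, and repeatedly applying it evolves a distribution one step at a time. The two states and the probabilities below are invented for illustration; they are not taken from this entry.

```python
# Sketch: a two-state Markov chain as a transition matrix.
# P[i][j] is the probability of moving from state i to state j;
# each row sums to 1.  The next state depends only on the current
# state, never on earlier history.
P = [
    [0.9, 0.1],   # transitions out of state 0
    [0.5, 0.5],   # transitions out of state 1
]

def evolve(dist, P):
    """One step of the chain: new_dist[j] = sum_i dist[i] * P[i][j]."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

dist = [1.0, 0.0]          # start in state 0 with certainty
for _ in range(50):
    dist = evolve(dist, P)
print(dist)                # converges toward the stationary distribution
```

For this particular matrix the chain settles at the stationary distribution [5/6, 1/6], regardless of the starting state.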
     
        A {Markov process} is governed by a Markov chain.
     
        In {simulation}, the principle of the Markov chain is applied
        to the selection of samples from a probability density
        function to be applied to the model.  {Simscript} II.5 uses
        this approach for some modelling functions.
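
The simulation use described above amounts to drawing each sample from a distribution conditioned on the previous sample. A minimal sketch of such sampling in Python (the state names and probabilities are hypothetical, chosen for illustration; this is not Simscript code):

```python
import random

# Sketch: sample a sequence of events from a Markov chain.  Each
# state maps to a list of (next_state, probability) pairs; the next
# sample is drawn from the distribution for the current state only.
transitions = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state):
    """Draw the next state given the current one."""
    states, probs = zip(*transitions[state])
    return random.choices(states, weights=probs)[0]

def walk(start, n):
    """Generate a sequence of n states starting from `start`."""
    seq = [start]
    for _ in range(n - 1):
        seq.append(step(seq[-1]))
    return seq

print(walk("sunny", 10))
```

Each call to `walk` produces one sample path; running it many times yields samples whose statistics follow the chain's transition probabilities.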
     
     
        (1995-02-23)