Markoff chain n : a Markov process in which the parameter takes discrete time values [syn: {Markov chain}]
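The definition above can be illustrated with a minimal sketch: a discrete-time Markov chain steps through states at integer time indices, and the next state depends only on the current one. The two-state weather chain below, with its state names and transition probabilities, is purely a hypothetical example and not part of the dictionary entry.

```python
import random

# Hypothetical two-state chain; states and probabilities are illustrative.
P = {
    "sunny": {"sunny": 0.9, "rainy": 0.1},
    "rainy": {"sunny": 0.5, "rainy": 0.5},
}

def step(state, rng):
    # Sample the next state from the row of the transition matrix for
    # the current state: the Markov property, at a discrete time step.
    r = rng.random()
    cum = 0.0
    for nxt, p in P[state].items():
        cum += p
        if r < cum:
            return nxt
    return nxt  # guard against floating-point rounding

def simulate(start, n_steps, seed=0):
    # Walk the chain for n_steps discrete time values.
    rng = random.Random(seed)
    chain = [start]
    for _ in range(n_steps):
        chain.append(step(chain[-1], rng))
    return chain

path = simulate("sunny", 10)
```

Each call to `step` uses only the current state, never the earlier history, which is exactly what distinguishes a Markov process; the integer step count is the discrete time parameter the definition refers to.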
Copyright © 2024 3Dict.net