A Markov process whose time parameter takes discrete values, i.e., the process moves between states at discrete time steps. Synonym: Markov chain.
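As an illustration, here is a minimal Python sketch of a discrete-time Markov chain; the two weather states and their transition probabilities are hypothetical, chosen only to show the idea.

import random

# Hypothetical two-state chain: the probabilities below are illustrative only.
# transitions[state] maps each possible next state to the probability of moving there.
transitions = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    """Draw the next state using only the current state (the Markov property)."""
    states = list(transitions[state].keys())
    probs = list(transitions[state].values())
    return random.choices(states, weights=probs, k=1)[0]

def simulate(start, n_steps):
    """Run the chain for n_steps discrete time steps and return the visited states."""
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1]))
    return path

print(simulate("sunny", 10))

Because the next state depends only on the current state, the whole process is specified by the start state and the transition probabilities.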
Words linked to "Markoff chain": Markoff process, Markov chain, Markov process