Markov chain
noun
1. A Markov process for which the parameter takes discrete time values.
Synonym: Markoff chain.
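The definition above can be illustrated with a short sketch: a discrete-time Markov chain advances in integer time steps, and the next state depends only on the current state. The state names and transition probabilities below are illustrative assumptions, not part of the dictionary entry.

```python
import random

# Illustrative two-state weather chain (assumed example values):
# each row maps a state to (next_state, probability) pairs summing to 1.
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state, rng):
    """Pick the next state using only the current state's transition row."""
    states, probs = zip(*TRANSITIONS[state])
    return rng.choices(states, weights=probs, k=1)[0]

def simulate(start, n_steps, seed=0):
    """Run the chain for n_steps discrete time steps; return the state path."""
    rng = random.Random(seed)
    state = start
    path = [state]
    for _ in range(n_steps):
        state = step(state, rng)
        path.append(state)
    return path
```

Because the process is parameterized by discrete time values, `simulate` returns one state per step, e.g. `simulate("sunny", 5)` yields a path of six states starting at `"sunny"`.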
WordNet 3.0 © 2010 Princeton University
Words linked to "Markov chain": Markov process, Markoff process
Copyright © 2024 Free Translator.org