Meaning of the word Markov process from the English dictionary, with examples, synonyms and antonyms.

Markov process   noun

Meaning : A simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived in the present state.

Synonyms : Markoff process
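
Example : The following is a minimal Python sketch illustrating the meaning above, assuming a hypothetical two-state "weather" chain with made-up transition probabilities; the next state is drawn using only the current state, never the earlier history.

import random

# Hypothetical transition probabilities for a two-state chain (illustration only).
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def next_state(current):
    # Sample the next state from a distribution that depends only on the current state.
    states = list(TRANSITIONS[current])
    weights = [TRANSITIONS[current][s] for s in states]
    return random.choices(states, weights=weights)[0]

def simulate(start, steps):
    # Generate a path of the chain; nothing beyond the last state is ever consulted.
    path = [start]
    for _ in range(steps):
        path.append(next_state(path[-1]))
    return path

print(simulate("sunny", 10))

Because next_state looks only at the current state, the simulated chain satisfies the Markov property described in the meaning above.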