Markov Process

[n] a stochastic process in which the distribution of future states depends only on the present state, not on the path by which the process arrived at that state
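The defining "memoryless" property can be illustrated with a small simulation. The sketch below (not part of the original entry; the two-state weather model and its transition probabilities are invented for illustration) samples each next state from a distribution that depends only on the current state:

```python
import random

# Illustrative two-state Markov process: each state maps to the
# distribution over possible next states. The probabilities here
# are made up for the example.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def next_state(current):
    """Sample the next state. Note that only `current` is used:
    earlier history never enters the calculation (Markov property)."""
    states = list(TRANSITIONS[current])
    weights = [TRANSITIONS[current][s] for s in states]
    return random.choices(states, weights=weights, k=1)[0]

def simulate(start, steps):
    """Return a trajectory of `steps` transitions starting from `start`."""
    path = [start]
    for _ in range(steps):
        path.append(next_state(path[-1]))
    return path

print(simulate("sunny", 5))
```

Because `next_state` receives only the current state, any two trajectories that arrive at the same state have identical distributions over their futures, which is exactly what the definition above requires.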



Synonyms

Markoff process
