
Markov chain

Meaning

Noun

In probability theory:
A discrete-time stochastic process with the Markov property.
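
In symbols, the Markov property says the next state depends only on the current state, not on the earlier history (a standard formulation, added here for clarity rather than taken from the Wiktionary entry):

P(X_{n+1} = x | X_n = x_n, ..., X_0 = x_0) = P(X_{n+1} = x | X_n = x_n)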
Sourced from Wiktionary