Markov chain
Meaning
Noun
● In probability theory: a discrete-time stochastic process with the Markov property.
Sourced from Wiktionary
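The definition above can be illustrated with a minimal sketch: a two-state chain whose next state depends only on the current state (the Markov property). The states and transition probabilities below are hypothetical examples, not part of the source entry.

```python
import random

# Illustrative transition probabilities for two hypothetical
# states; each row's probabilities sum to 1.
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state):
    """Pick the next state using only the current state —
    the Markov property: no dependence on earlier history."""
    states, weights = zip(*TRANSITIONS[state])
    return random.choices(states, weights=weights, k=1)[0]

def simulate(start, n):
    """Run the chain for n steps and return the visited states."""
    path = [start]
    for _ in range(n):
        path.append(step(path[-1]))
    return path
```

Because each call to `step` consults only the current state, the sequence produced by `simulate` is a discrete-time stochastic process with the Markov property.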