Markovian

adjective Mar·kov·ian \mär-ˈkō-vē-ən, -ˈkȯ-\

Definition of MARKOVIAN

:  of, relating to, or resembling a Markov process or Markov chain especially by having probabilities defined in terms of transition from the possible existing states to other states
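The defining trait above — transition probabilities that depend only on the current state, not on the history — can be illustrated with a short sketch. The states and probabilities below are invented for illustration; they are not part of the dictionary entry.

```python
import random

# Hypothetical two-state Markov chain: each row gives the probability
# of transitioning from the current state to each possible next state.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def next_state(state, rng=random.random):
    """Pick the next state using only the current state's transition row."""
    r = rng()
    cumulative = 0.0
    for target, p in TRANSITIONS[state].items():
        cumulative += p
        if r < cumulative:
            return target
    return target  # guard against floating-point rounding at 1.0

def walk(start, steps, seed=0):
    """Generate a chain of states; each step is Markovian by construction."""
    random.seed(seed)
    state = start
    chain = [state]
    for _ in range(steps):
        state = next_state(state)
        chain.append(state)
    return chain
```

Note that `next_state` never inspects earlier states — that restriction is exactly what makes the process Markovian in the sense defined above.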

Variants of MARKOVIAN

Mar·kov·ian or Mar·kov \ˈmär-ˌkȯf, -ˌkȯv\ also Mar·koff \ˈmär-ˌkȯf\

First Known Use of MARKOVIAN

1944
