Edit made on March 06, 2009 by DerekCouzens at 12:57:23


A Markov Chain is a collection of states that you move between,

making transitions according to probability distributions, in which the probability of the next state occurring depends only on the current state. More to

follow.
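
The idea above can be sketched in a few lines of Python. This is a minimal illustration, not from the original page: the two "weather" states and their transition probabilities are invented for the example, and each step chooses the next state using only the current state's probability distribution.

```python
import random

# Hypothetical two-state chain: each row gives the probabilities of
# moving to each next state, conditioned only on the current state.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state, rng):
    """Pick the next state from the current state's distribution."""
    states = list(TRANSITIONS[state])
    weights = [TRANSITIONS[state][s] for s in states]
    return rng.choices(states, weights=weights)[0]

def walk(start, n, seed=0):
    """Simulate n transitions starting from the given state."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(n):
        chain.append(step(chain[-1], rng))
    return chain

if __name__ == "__main__":
    print(walk("sunny", 5))
```

Because each transition looks only at the current state, the whole model fits in one table of probabilities; longer histories are never consulted.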

Markov chains are now also used as a technique for spam filtering.

* http://www.google.co.uk/search?q=Markov+Chain