#1 Roulette Forum & Message Board | www.RouletteForum.cc

Resources & Downloads => Mathematics => Topic started by: vladir on Mar 22, 07:07 AM 2016

Title: Markov Chains explained
Post by: vladir on Mar 22, 07:07 AM 2016
http://setosa.io/ev/markov-chains/
Title: Re: Markov Chains explained
Post by: NextYear on Mar 22, 10:24 AM 2016
Thanks, a little bit of science shouldn't kill anybody...
Title: Re: Markov Chains explained
Post by: Rourke on Mar 22, 01:34 PM 2016
Very interesting post, Vladir :-)

However... I don't think you can apply the following to roulette:

Quote: We can mimic this "stickiness" with a two-state Markov chain. When the Markov chain is in state "R", it has a 0.9 probability of staying put and a 0.1 chance of leaving for the "S" state. Likewise, the "S" state has a 0.9 probability of staying put and a 0.1 chance of transitioning to the "R" state.
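
For readers who want to see the quoted chain in action, here is a minimal Python sketch (not from the linked article or the original posts; the function and variable names are my own) that simulates the two-state "sticky" chain with the 0.9/0.1 transition probabilities and counts how often each state is visited. Despite the stickiness, the long-run share of time in "R" and "S" comes out around 0.5 each.

```python
import random

# Transition probabilities for the two-state chain described in the quote:
# from "R": stay in "R" with 0.9, move to "S" with 0.1
# from "S": stay in "S" with 0.9, move to "R" with 0.1
TRANSITIONS = {
    "R": {"R": 0.9, "S": 0.1},
    "S": {"S": 0.9, "R": 0.1},
}

def step(state):
    """Pick the next state according to the transition probabilities."""
    r = random.random()
    cumulative = 0.0
    for next_state, p in TRANSITIONS[state].items():
        cumulative += p
        if r < cumulative:
            return next_state
    return state  # fallback for floating-point rounding

def simulate(start="R", steps=100_000):
    """Run the chain and count visits to each state."""
    counts = {"R": 0, "S": 0}
    state = start
    for _ in range(steps):
        counts[state] += 1
        state = step(state)
    return counts

if __name__ == "__main__":
    counts = simulate()
    total = sum(counts.values())
    for s, c in counts.items():
        # Both fractions tend toward ~0.5 over a long run,
        # even though consecutive states are strongly correlated.
        print(f"{s}: {c / total:.3f}")
```

The stickiness only affects how the states are clustered in time, not their long-run frequencies, which is one way of seeing why the 0.9/0.1 example does not carry over to independent roulette spins.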