Author Topic: Markov Chains explained  (Read 587 times)

vladir

  • 250+ posts Member
  • Posts: 453
  • Rated: 0
Markov Chains explained
« on: March 22, 2016, 11:07:54 AM »
"In God we trust; all others must bring data", W. Edwards Deming


NextYear

  • 500+ posts Member
  • Posts: 706
  • Gender: Male
  • Rated: +13
Re: Markov Chains explained
« Reply #1 on: March 22, 2016, 02:24:23 PM »
Thanks, a little bit of science shouldn't kill anybody...

Rourke

  • Member
  • Posts: 39
  • Gender: Male
  • Rated: 0
Re: Markov Chains explained
« Reply #2 on: March 22, 2016, 05:34:03 PM »
Very interesting post, Vladir :-)

However... I don't think you can apply the following to roulette:

Quote
We can mimic this "stickiness" with a two-state Markov chain. When the chain is in state "R", it has a 0.9 probability of staying put and a 0.1 chance of leaving for the "S" state. Likewise, state "S" has a 0.9 probability of staying put and a 0.1 chance of transitioning to the "R" state.
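
The quoted chain is easy to simulate. Below is a minimal sketch (not from the original post) of the symmetric two-state chain with stay-probability 0.9: because the chain is symmetric, the long-run share of time spent in "R" is 1/2, while the "stickiness" shows up as long streaks whose average length is 1 / (1 - 0.9) = 10. The state names "R" and "S" and the function `simulate` are just illustrative choices.

```python
import random

STAY = 0.9  # probability of staying in the current state, as in the quote

def simulate(steps, seed=42):
    """Simulate the two-state chain and return the sequence of visited states."""
    rng = random.Random(seed)
    state = "R"
    path = [state]
    for _ in range(steps):
        if rng.random() >= STAY:                  # leave with probability 0.1
            state = "S" if state == "R" else "R"
        path.append(state)
    return path

path = simulate(100_000)

# Long-run fraction of time in "R": the chain is symmetric, so this hovers near 0.5.
share_r = path.count("R") / len(path)
print(f"long-run share of 'R': {share_r:.3f}")

# Streak lengths: with stay-probability p, run lengths are geometric with
# mean 1 / (1 - p) = 10, which is where the "stickiness" is visible.
runs, current = [], 1
for a, b in zip(path, path[1:]):
    if a == b:
        current += 1
    else:
        runs.append(current)
        current = 1
runs.append(current)
avg_run = sum(runs) / len(runs)
print(f"average run length: {avg_run:.1f}")
```

Note that this says nothing about a real wheel: each roulette spin is independent, so its "transition probabilities" don't depend on the previous outcome at all, which is exactly Rourke's objection below.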


