
Author Topic: Markov Chains explained  (Read 626 times)


vladir

  • 250+ posts Member
  • ****
  • Posts: 453
  • Rated: 0
Markov Chains explained
« on: March 22, 2016, 11:07:54 AM »
"In God we trust; all others must bring data", W. Edwards Deming


NextYear

  • 500+ posts Member
  • *****
  • Posts: 711
  • Gender: Male
  • Roulette Forum .cc | Member
  • Rated: +14
Re: Markov Chains explained
« Reply #1 on: March 22, 2016, 02:24:23 PM »
Thanks, a little bit of science shouldn't kill anybody...

Rourke

  • Member
  • *
  • Posts: 39
  • Gender: Male
  • Roulette Forum .cc | Member
  • Rated: 0
Re: Markov Chains explained
« Reply #2 on: March 22, 2016, 05:34:03 PM »
Very interesting post, Vladir :-)

However... I don't think you can apply the following to roulette:

Quote
We can mimic this "stickiness" with a two-state Markov chain. When the Markov chain is in state "R", it has a 0.9 probability of staying put and a 0.1 chance of leaving for the "S" state. Likewise, the "S" state has a 0.9 probability of staying put and a 0.1 chance of transitioning to the "R" state.
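The sticky chain in the quote is easy to check by simulation. Below is a minimal sketch (the state names "R"/"S" and the 0.9/0.1 probabilities come from the quote; the function names and seed are my own) that walks the chain and measures how often it actually stays put:

```python
import random

# Transition probabilities from the quoted two-state chain:
# from either state, 0.9 chance of staying, 0.1 chance of switching.
TRANSITIONS = {
    "R": {"R": 0.9, "S": 0.1},
    "S": {"S": 0.9, "R": 0.1},
}

def simulate(start, steps, seed=42):
    """Walk the chain for `steps` transitions; return the list of visited states."""
    rng = random.Random(seed)
    state = start
    path = [state]
    for _ in range(steps):
        options = TRANSITIONS[state]
        state = rng.choices(list(options), weights=list(options.values()))[0]
        path.append(state)
    return path

path = simulate("R", 10_000)
stays = sum(1 for a, b in zip(path, path[1:]) if a == b)
print(stays / (len(path) - 1))  # empirical stay rate; close to 0.9 for a long run
```

Note that because the chain is symmetric, in the long run it still spends half its time in each state; the stickiness only makes the runs longer, which is exactly why this behaviour differs from independent spins.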


