
Frequency of an event

Started by TRD, May 20, 08:50 AM 2022


TRD

Is there any willing mathematician on the forum (or anyone with a good grasp of the math on the roulette side of things)
prepared to take the matter at hand further, allowing me to:

•  add something, should you deem it would give me an even better comprehension
   of the things specified & more, facilitating an even smoother application of it

•  demonstrate how to calculate the frequency of an event,
   keeping in mind that it is not constant, & thereby precisely calculate
   the 'worst' or most intense frequency of such an event, the event being = 1/q

  .... so we are looking at getting some type of 'interval of frequency'
  (very much like the format of confidence interval results)

TRD

To begin with, before even applying MoneyT101's YN groups bs concept
(which by itself would definitely improve the average & overall performance parameters anyway);

I've included here a detailed extract of the stats of an all-winning method,
which has some very rare games going beyond the desired acceptable
performance parameters .. those being:


(eg. -100u max drawdown + ≈100 spins max length of an individual game to nominal profit,
or  -150u max drawdown + ≈100-150 spins max length of an individual game to nominal profit;
with a bankroll vs +100u profitable-session ratio that nearly maximizes simultaneously
both the base unit increase & thus the compounding effect)


.. so for the purposes of generating insight & an assessment, I am playing/toying
with the idea of introducing a stop-loss = rather than driving those rare individual
games over +200 spins into nominal profit, terminate the session based on the
fundamental premise of .. in short:


my method is so strong that I can afford to use Marti on it


or, in a bit longer version .. my 'base sequence' (eg. up to 150u bankroll + ≈150 spins till termination = an event in itself) is so strong, with the extended bottleneck games being so rare & far apart, that those can (potentially) be terminated as 'non-completes' -- & thereafter the base unit Marti takes care of it ..




TRD

As you can see in this limited stats base, I've played/included:

3533 spins
555 games
11 sessions


.. where a session is a 'gamechain' .. of individual games until the inevitable profit;
•  it takes on average 50 games to form a session, ranging somewhere from 70 down to 40,
   even below that to 30 & less on faster games
•  it takes on average ≈300 spins to form a session, here specifically averaging 321

made +1037u,
or 0.2935u/spin, 1.8685u/game
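
.. those two rates are just the totals divided through; a quick check in Python, for anyone following along in a script rather than the sheet:

Code:
profit, spins, games = 1037, 3533, 555
print(f"{profit / spins:.4f} u/spin, {profit / games:.4f} u/game")   # 0.2935 u/spin, 1.8685 u/game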


The last session, due to the extreme game & the sheer number of spins, was terminated early .. (before the standard session goal of +100u, although if continued for another ≈150-200 spins it too gets driven to +100u) .. so basically just above session break-even .. with a nominal session profit;

[which, btw, is some members' definition of a HG on this site .. win or break-even -- not mine though, I am a perfectionist]

TRD

Moreover, I've input all games played into the sheet, each with two attributes --
the individual game's .. max exposition amount, & length (till new high, or profit);

then I've sorted all the games into two arrays, categorized by either exposition or length,
broken down into multiple group criteria for each --

ie.
max exposition, units | -1 till -9, -10 till -19, -20 till -29, .. etc. .., -100 till -150, -200 till -300, -300+
length, spins | 1 till 10, 11 till 20, 21 till 30, .. etc. .., 101 till 150, 151 till 200, 201 till 300, 300+


.. thereof got the number of games per each criterion, & the ratio of each criterion versus the total games played .. or simply a percentage
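
(in case anyone wants to rebuild those two arrays outside the sheet -- a minimal Python sketch; the bin edges & the few sample (exposition, length) pairs below are made up purely for illustration, the real per-game data from the sheet would be plugged in instead)

Code:
from bisect import bisect_right

# (max exposition in units, length in spins) per game -- illustrative values only
games = [(-4, 7), (-23, 18), (-112, 134), (-8, 12), (-61, 88)]

expo_edges   = [-300, -200, -150, -100, -90, -80, -70, -60, -50, -40, -30, -20, -10, 0]
length_edges = [0, 10, 20, 30, 40, 50, 100, 150, 200, 300]

def bucket_counts(values, edges):
    """Count how many values fall into each interval delimited by the edges."""
    counts = [0] * (len(edges) + 1)
    for v in values:
        counts[bisect_right(edges, v)] += 1
    return counts

total = len(games)
for label, col, edges in (("exposition", 0, expo_edges), ("length", 1, length_edges)):
    counts = bucket_counts([g[col] for g in games], edges)
    print(label, counts, [f"{c / total:.1%}" for c in counts])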

TRD

Hereof, let's say we trim the bottlenecks off ..
hypothetically applying only 150u bankroll.



The general stats change to:
(columns BC)

+923u
3343 spins
555 games    ..    1x game terminated
11 sessions   ..    1x session terminated

The terminated session already pocketed 43u (which are never re-staked, playing only with the allocated/designated bankroll .. in the big picture, contributing to the base unit increase & compounding effect) --

thus (+43 - 150=) resulting in -107u -- to be recovered.
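
eg. if the base unit is simply doubled for the session that follows the non-complete (one possible reading of the one-step Marti from the earlier post -- illustrative only, not the final scheme), the recovery arithmetic would look like:

Code:
bankroll     = 150   # allocated per-session bankroll, units
pocketed     = 43    # profit already banked before the termination
session_goal = 100   # nominal session profit, units

debt = pocketed - bankroll            # +43 - 150 = -107u to be recovered
recovered_next = 2 * session_goal     # next session played at 2x base unit -> +200u nominal
print(f"debt after the non-complete: {debt}u")
print(f"net after one doubled-unit session: {debt + recovered_next}u")   # -107 + 200 = +93u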

TRD

Bringing us to the above-mentioned recent delving into the math side of things:


p = 1 - q = 0.9982
q = 1 - p = 0.0018
----------------------------
inv.p = 1/p = totality/p = 10000/9982 = 1.0018
inv.q = 1/q = totality/q = 10000/18 = 555.5~
[1x non-complete every ≈555 games]
------------------------------------------------------------------
var = p x q = 0.9982 x 0.0018 = 0.00179676
n = number of games = 1000
[at avg ≈50 games/session ≈ 20 sessions]
---------------------------------------------------------------------
mean success = n x p = 1000 x 0.9982 = 998.2
[expected .. 998 successes in 1000 games played, or
a non-complete roughly 1 in 10 sessions, the other ≈9 in 10 sessions clean]

variance for number of trials = n x var = 1000 x 0.00179676 = 1.79676
sigma, σ = √(n x var) = √1.79676 = 1.3404
se (used here as the 2-sigma margin) = 2 x sigma = 2 x 1.3404 = 2.6809
-------------------------------------------------------------------------------------------
conf.int (for the number of completes in n = 1000 games) = mean ± k x sigma:
2 sigma (≈95.45% coverage)         998.2 ± 2.6809
3 sigma (≈99.73%)                  998.2 ± 4.0213
4 sigma (≈99.9937%)                998.2 ± 5.3617
5 sigma (≈99.99994%)               998.2 ± 6.7022
6 sigma (≈99.9999998%)             998.2 ± 8.0426
(upper bounds are capped at 1000 completes -- the normal approximation is rough here, since n x q ≈ 1.8 is small)


p  probability of an event happening successfully, reliability (= a complete)
q  probability of an event not happening (= a non-complete)
inv.p  inverse of the success probability, the expected number of events per
            one success
inv.q  inverse of the non-success probability, the expected number of events per
            one non-complete (here ≈555 games per non-complete)
var  variance, a measure of dispersion, ie. how far a set of numbers (data) is spread
          out from its average value (meaning the expectation is only an ideal)
n  instances of an event (can be a spin, game, session; here a game)
mean success  the average number of successes expected over n events
variance for number of trials  a measure of dispersion within the specified number
                                                         of events
sigma, std.dev  a measure of the amount of variation or dispersion of
                              a set of values
se  used above simply as the 2-sigma margin; the textbook standard error proper is
          an estimate of how far the sample mean is likely to be from the population mean
conf.int  confidence interval, tells the possible range around the estimate & also
                  how stable the estimate is .. meaning one that would come out close to
                  the same value if play were repeated.
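
For anyone who wants to reproduce or play with the figures above, a minimal Python sketch (using the p, q & n = 1000 games from this post; the Poisson tail at the end is an extra, since with n x q ≈ 1.8 non-completes it tracks the exact binomial better than the normal interval does):

Code:
import math

p = 0.9982            # probability a game completes
q = 1 - p             # probability of a non-complete (0.0018)
n = 1000              # number of games (≈20 sessions at ≈50 games/session)

mean_success = n * p              # 998.2 expected completes
sigma = math.sqrt(n * p * q)      # ≈1.3404

print(f"1 non-complete every {1 / q:.1f} games")
for k in (2, 3, 4, 5, 6):
    lo = mean_success - k * sigma
    hi = min(n, mean_success + k * sigma)   # cannot exceed n completes
    print(f"{k}-sigma interval: {lo:.1f} .. {hi:.1f} completes in {n} games")

# the non-complete count is ≈ Poisson(n*q) because q is tiny
lam = n * q
for m in (2, 3, 4):
    tail = 1 - sum(math.exp(-lam) * lam**i / math.factorial(i) for i in range(m))
    print(f"P(at least {m} non-completes in {n} games) ≈ {tail:.3f}")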

TRD

REQUEST

1.  Add more to the definitions --
if you deem something will allow me a better comprehension of the dynamic between
the bold figures; secondly, regarding eg. the decimal value of the variance for the number
of trials .. what does that actually tell me in context, practically speaking, or is it simply
used as a factor in further calculation, holding no other significance?


2.  Moreover, & most importantly, given that the confidence interval is a vector + the
amplitude is the angle of the vector .. & we are looking for the frequency of the 1/q events
..

(in other words, practically -- the application of Marti, over/on the 'base sequence' =
game ← bottleneck terminated as a non-complete of an event)

.. & since in roulette neither the amplitude nor frequency is constant .. specifically,
we are looking for the highest intensity of such frequency telling us essentially
either

•  how many 1/q events in a row we can expect max .. in the desired interval of games
played, thereof the most likely frequency of these 2x, 3x, (4x?) in the row event appearing
•  or, taking another angle, given the resulting frequency of 2x in row event is low enough,
    what would the required 'play bankroll' amount be, ascertained from the fundamentally
   required P (see image, top left array) .. which satisfies the premise of the one-step Marti
   takes care of the most intense frequency manifested (potentially extended max to two-step)


what would the formula to get to this value be ?
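
(one rough starting point, assuming independent games each with non-complete probability q -- not the final answer I'm after: the chance of k non-completes back-to-back somewhere within N games is ≈ N x q^k for small q, & the 'typical' longest run over N games is ≈ ln(N)/ln(1/q) .. eg.)

Code:
import math

q = 0.0018        # per-game non-complete probability, from the stats above
N = 10000         # horizon in games -- an arbitrary illustrative choice

for k in (1, 2, 3):
    approx = min(1.0, N * q**k)   # rough chance of a run of k non-completes in a row within N games
    print(f"P(run of {k} in a row within {N} games) ≈ {approx:.4f}")

print(f"typical longest run ≈ {math.log(N) / math.log(1 / q):.2f} non-completes")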


Toimeme

You need a Nobel prize in applied maths to understand this thread.

Blueprint

You have a drawdown.  You may want to reconsider everything you're doing.  Cheers.

TRD

@Blueprint;

debt creation, or drawdown, is inherent to roulette play -- from the point of view of an individual game to profit .. the first no-hit bet already creates an outstanding debt to be resolved (= essentially, 1x no-hit = recovery) .. so basically it's unavoidable ← inherent;

what's important are essentially two things -- interacting with variance ..
•  how much drawdown the method itself creates (exposition increase rate, volatility, protrusion rate = is drawdown minimized & at any point recoverable, requiring only the shortest turn of variance in favor)
•  how, or in what manner (with which tools), is outstanding debt dealt with -- does it ALWAYS get recovered (the answer is YES)

+

Given that I mostly play 2Q as a wide bet (1st hit) & a single ST as a focused bet (combo hit)
in those ≈5% of upper games with higher exposition & longer length (top left array) ..
the drawdown actually ain't that big at all.

Especially when only  2% go over (-49),  0.36% over (-99), &  0.18% over (-150).

So, the dickhead says .. NO.
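
(those exceedance percentages are just cumulative counts over the exposition column of the sheet; a small Python sketch -- the expositions list here is a made-up placeholder, the real column would be pasted in)

Code:
expositions = [-4, -23, -112, -8, -61, -155, -17, -33]   # placeholder max expositions per game

total = len(expositions)
for threshold in (-49, -99, -150):
    over = sum(1 for e in expositions if e < threshold)   # games whose drawdown goes beyond the threshold
    print(f"{over / total:.2%} of games go over ({threshold})")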



======================



@toimeme .. & I need a translator to decode what you said; English, thanks.
Nonetheless, it really ain't that hard to comprehend = the frequency fluctuation of 1/q events â†'
telling how often 1/q happens in a row .. when the frequency is at its most intense


Blueprint

We obviously play very different games.  Good luck in your quest!

TRD

You sound like a casino dealer.

TRD

For those who find it useful, the above screenshot sheet;

manual input -- row 31↓, columns BCDE & H→
& on the second tab .. cells D5 (maps to first tab column E, in the row of the bankroll amount)
                                     D35 (number of events = games)

TRD

& I do not rely on luck.

Blueprint

Quote from: TRD on May 20, 03:49 PM 2022
You sound like a casino dealer.

You sound like Falkor.
