I'm sure this has been discussed here before, but for those who haven't seen it: I just came across a simple formula for what the Kelly system claims is your ideal bet size. Here it is:
f = (C*P - 1)/(C - 1)
where:
C is the coefficient representing your payout (i.e. decimal odds)
P is the probability your bet hits (between 0 and 1)
f is the fraction of your bankroll to stake
So C for a standard 11-to-10 bet is 210/110 = 1.909090... (it's the number you multiply your bet amount by to get your *total* amount returned, stake included).
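To make that concrete, here's a quick Python sketch of the formula (the function name and the clamping-to-zero are my additions, not part of the formula itself):

```python
def kelly_fraction(c, p):
    """Kelly stake as a fraction of bankroll.

    c: decimal payout coefficient (total returned per unit staked)
    p: probability the bet wins, between 0 and 1
    """
    f = (c * p - 1) / (c - 1)
    # A negative f means the bet has negative expected value: stake nothing.
    return max(f, 0.0)
```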
Example#1: If you think you can consistently pick 55% winners in the NBA you should bet (1.9090909*0.55-1)/(1.9090909-1) or 5.5% of your bankroll on each bet (assuming you are laying 110 to win 100)
Example#2: If you have an underdog at +400 that you think has a 25% chance of hitting, you should bet (5*0.25-1)/(5-1) or 6.25% of your bankroll on it.
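And the two examples, run through that sketch:

```python
# Example 1: lay 110 to win 100, so C = 210/110
print(kelly_fraction(210 / 110, 0.55))  # 0.055 -> 5.5% of bankroll

# Example 2: +400 underdog with a 25% chance of hitting
print(kelly_fraction(5, 0.25))          # 0.0625 -> 6.25% of bankroll
```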
Next I want to explore how to figure out the chances of losing a given fraction of your bankroll if you use this strategy. Anyone have ideas on how to approach this deductively (i.e., without having to resort to simulations)?
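One possible starting point, offered as a sketch rather than an answer: because you stake a fixed fraction every time, the bankroll after w wins and l losses is the same regardless of order, so over a fixed number of bets you can compute the dip probability exactly with a small dynamic program (no random sampling). All names here are mine, and it only covers a finite horizon; the infinite-horizon "ever dip" question turns into a random-walk / gambler's-ruin problem on the log of the bankroll.

```python
def drawdown_prob(c, p, f, floor, n_bets):
    """Exact probability that the bankroll ever dips below `floor`
    (a fraction of the starting bankroll) within n_bets bets, when
    staking fraction f of the bankroll at decimal odds c with win
    probability p. Dynamic programming over (bets, wins) -- exact,
    no simulation.
    """
    u = 1 + f * (c - 1)  # bankroll multiplier on a win
    d = 1 - f            # bankroll multiplier on a loss
    # alive[w] = probability of having w wins so far without the
    # bankroll ever having dropped below the floor
    alive = {0: 1.0}
    dipped = 0.0
    for t in range(1, n_bets + 1):
        nxt = {}
        for w, prob in alive.items():
            for won, pr in ((1, prob * p), (0, prob * (1 - p))):
                w2 = w + won
                bankroll = u ** w2 * d ** (t - w2)
                if bankroll < floor:
                    dipped += pr  # absorbed: this path has dipped
                else:
                    nxt[w2] = nxt.get(w2, 0.0) + pr
        alive = nxt
    return dipped

# e.g. chance of ever being down 50% within 500 bets at Example 1's numbers
print(drawdown_prob(210 / 110, 0.55, 0.055, 0.5, 500))
```

Since the dip probability can only grow as you add bets, increasing n_bets makes this converge from below to the probability of ever dipping.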