Here is a statement of the Gambler's Ruin Problem.
Let A be your amount of money, and let B be how much money the house has.
Let p be your probability of winning one game, and let q = 1 - p.
If p = q = 0.5, then Pr(you go broke) = B/(A+B).
If p <> q and p <> 0 and q <> 0, then
Pr(you go broke) = (1 - (q/p)^B) / (1 - (q/p)^(A+B)).
If A = 0 and B > 0, then Pr(you go broke) = 1.
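Here is a minimal Python sketch that just plugs in the formulas exactly as
stated above; the function name, and the choice of returning None for inputs
the formulas don't cover, are my own conventions, not part of the problem:

def ruin_probability(A, B, p):
    """Pr(you go broke), using the formulas exactly as stated above.

    A = your money, B = the house's money, p = your chance of winning
    one game.  Returns None for inputs the formulas above don't cover
    (including the A = 0, B = 0 cases asked about below).
    """
    q = 1.0 - p
    if A == 0 and B == 0:
        return None              # not covered; see the question below
    if A == 0 and B > 0:
        return 1.0               # broke at the start
    if p == 0.5:
        return B / (A + B)       # fair-game case
    if p != 0 and q != 0:        # p <> q, p <> 0, q <> 0
        r = q / p
        return (1 - r ** B) / (1 - r ** (A + B))
    return None                  # p = 0 or p = 1: the formulas exclude these

For example, ruin_probability(10, 20, 0.5) returns 20/30, about 0.667.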
BUT...
if A > 0 and B > 0 and p = 1, doesn't the formula fail? If p = 1, then q = 0, so
Pr(you go broke) = (1 - (q/p)^B) / (1 - (q/p)^(A+B)) = (1 - 0)/(1 - 0) = 1,
which implies that if your chance of winning each game is 100%, the
formula predicts that Pr(you go broke) = 100%.
I'm thinking of writing a program for Gambler's Ruin, but I'm wondering:
how would you define the value of Pr for
p = 1, A = 0, B = 0; or p = 0, A = 0, B = 0; or p = 0.5, A = 0, B = 0?
I ask because in each of those cases A = 0, which means you're broke,
but B = 0 too, so the house is broke as well. Would that mean that
Pr = 1, because you're broke at the start regardless of what
the house has?
How about this one: p = 1, A = 10, B = 0?
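For what it's worth, the kind of program I had in mind would probably just
simulate the game and count ruins, something like the Python sketch below.
Note that the way it scores the A = 0, B = 0 start is exactly the convention
I'm unsure about, so treat that line as a placeholder, not an answer:

import random

def simulate_ruin(A, B, p, trials=100000):
    """Estimate Pr(you go broke) by playing the game over and over.

    Each trial moves one dollar per game until you or the house has
    nothing left.  A start with A = 0 is counted as you being broke,
    even when B = 0 too -- that is the convention in question.
    """
    broke = 0
    for _ in range(trials):
        a, b = A, B
        while a > 0 and b > 0:
            if random.random() < p:   # you win this game
                a += 1
                b -= 1
            else:                     # you lose this game
                a -= 1
                b += 1
        if a == 0:
            broke += 1
    return broke / trials

With p = 1, A = 10, B = 0 this reports 0 (the house starts broke, so you
never lose anything), and with A = 0, B = 0 it reports 1, but only because
of the convention noted in the comment.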
--
Patrick D. Rockwell
prockwell@thegrid.net
hnhc85a@prodigy.net
patri48975@aol.com