TRANSCRIPT
Parrondo's Paradox
Two losing games can be combined to make a winning game.
Game A: repeatedly flip a biased coin (coin a) that comes up heads with probability pa < 1/2 and tails with probability 1 - pa. You win a dollar each time it comes up heads.
Game B: repeatedly flip coins, but the coin that is flipped depends on the previous outcomes.
Let w be the number of your wins so far and l the number of your losses.
Each round we bet one dollar, so w-l represents winnings: if it is negative, you have lost money.
Here we use two biased coins (coin b and coin c).
If 3 | (w - l), then flip coin b, which comes up heads with probability pb.
Otherwise flip coin c, which comes up heads with probability pc.
Again, you win one dollar if it comes up heads.
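The two games can be sketched as a short simulation. This is a sketch, not part of the original: pa = 0.49 is an assumed illustrative value with pa < 1/2, and pb = 0.09, pc = 0.74 are the values the text uses later.

```python
import random

def play_game_a(pa, rounds, rng):
    """Game A: each round, win $1 with probability pa, else lose $1."""
    winnings = 0
    for _ in range(rounds):
        winnings += 1 if rng.random() < pa else -1
    return winnings

def play_game_b(pb, pc, rounds, rng):
    """Game B: flip coin b when 3 | (w - l), else flip coin c."""
    winnings = 0
    for _ in range(rounds):
        p = pb if winnings % 3 == 0 else pc
        winnings += 1 if rng.random() < p else -1
    return winnings

rng = random.Random(0)
# Both games lose money on average over many rounds.
print(play_game_a(0.49, 100_000, rng))
print(play_game_b(0.09, 0.74, 100_000, rng))
```

Note that Python's `%` operator returns a nonnegative result for negative winnings, so `winnings % 3 == 0` correctly tests 3 | (w - l).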
Suppose pb = 0.09 and pc = 0.74. If coin b were used for the 1/3 of the time that the winnings are a multiple of 3, and coin c the other 2/3 of the time, the probability of winning would be
(1/3)(9/100) + (2/3)(74/100) = 157/300 > 1/2.
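The arithmetic can be checked exactly with Python's fractions module:

```python
from fractions import Fraction

# Exact check of the hypothetical winning probability when coin b
# is used exactly 1/3 of the time.
win_prob = Fraction(1, 3) * Fraction(9, 100) + Fraction(2, 3) * Fraction(74, 100)
print(win_prob)                   # 157/300
print(win_prob > Fraction(1, 2))  # True
```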
But coin b may not be used 1/3 of the time!
Intuitively, when starting with winnings 0, you use coin b and most likely lose, after which you use coin c and most likely win. Thus you may spend a lot of time going back and forth between having lost one dollar and breaking even before either winning one dollar or losing two dollars.
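This can be tested empirically. The sketch below (using the text's pb and pc) measures the fraction of rounds in which coin b is actually flipped; it comes out well above 1/3, consistent with the stationary analysis later in the text.

```python
import random

def coin_b_fraction(pb, pc, rounds, rng):
    """Fraction of rounds in which Game B actually flips coin b."""
    winnings = 0
    b_flips = 0
    for _ in range(rounds):
        if winnings % 3 == 0:
            b_flips += 1
            p = pb
        else:
            p = pc
        winnings += 1 if rng.random() < p else -1
    return b_flips / rounds

rng = random.Random(1)
# Observed fraction is noticeably more than 1/3.
print(coin_b_fraction(0.09, 0.74, 200_000, rng))
```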
Suppose we start playing Game B when the winnings are 0, continuing until we either lose three dollars or win three dollars. Note that if you are more likely to lose than win in this case, then by symmetry you are more likely to lose 3 dollars than win 3 dollars whenever 3 | (w - l).
Consider the Markov chain on the state space consisting of the integers {-3,-2,-1,0,1,2,3}.
We show that it is more likely to reach -3 than 3.
Let zi be the probability that the game reaches -3 before reaching 3 when starting with winnings i.
We want to calculate zi for i = -3, ..., 3, especially z0.
z0 > 1/2 means it is more likely to lose three dollars than to win three dollars starting from 0.
We have the following equations (with boundary conditions z-3 = 1 and z3 = 0):
z-2 = (1-pc) z-3 + pc z-1,
z-1 = (1-pc) z-2 + pc z0,
z0 = (1-pb) z-1 + pb z1,
z1 = (1-pc) z0 + pc z2,
z2 = (1-pc) z1 + pc z3.
Solving this system of equations, we have
z0 = (1-pb)(1-pc)^2 / [(1-pb)(1-pc)^2 + pb pc^2] ≈ 0.555.
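As a sanity check, the closed form can be compared against a direct fixed-point iteration of the same equations (a sketch using the text's pb and pc):

```python
pb, pc = 0.09, 0.74

# Closed-form value of z0 from above.
z0 = ((1 - pb) * (1 - pc)**2) / ((1 - pb) * (1 - pc)**2 + pb * pc**2)
print(round(z0, 3))  # 0.555

# Cross-check by iterating the hitting-probability equations to a fixed
# point, with boundary values z[-3] = 1 and z[3] = 0.
z = {-3: 1.0, -2: 0.5, -1: 0.5, 0: 0.5, 1: 0.5, 2: 0.5, 3: 0.0}
for _ in range(10_000):
    z[-2] = (1 - pc) * z[-3] + pc * z[-1]
    z[-1] = (1 - pc) * z[-2] + pc * z[0]
    z[0]  = (1 - pb) * z[-1] + pb * z[1]
    z[1]  = (1 - pc) * z[0]  + pc * z[2]
    z[2]  = (1 - pc) * z[1]  + pc * z[3]
print(round(z[0], 3))  # 0.555
```

The iteration converges because the chain is absorbing at -3 and 3, so the system has a unique fixed point.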
Instead of solving the equations directly, there is a simpler way to determine the relative probability of reaching -3 or 3 first.
Consider any sequence of moves that starts at 0 and ends at 3 before reaching -3.
E.g., s = 0, 1, 2, 1, 2, 1, 0, -1, -2, -1, 0, 1, 2, 1, 2, 3.
Create a one-to-one and onto mapping between such sequences and the sequences that start at 0 and end at -3 before 3, by negating every number starting from the last 0 in the sequence.
E.g., f(s) = 0, 1, 2, 1, 2, 1, 0, -1, -2, -1, 0, -1, -2, -1, -2, -3.
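The mapping is easy to express in code (a sketch; the sequence below is the example from the text):

```python
def f(seq):
    """Negate every entry after the last 0 (negating 0 itself is a no-op)."""
    last_zero = max(i for i, x in enumerate(seq) if x == 0)
    return seq[:last_zero + 1] + [-x for x in seq[last_zero + 1:]]

s = [0, 1, 2, 1, 2, 1, 0, -1, -2, -1, 0, 1, 2, 1, 2, 3]
print(f(s))
# [0, 1, 2, 1, 2, 1, 0, -1, -2, -1, 0, -1, -2, -1, -2, -3]
print(f(f(s)) == s)  # True: f is its own inverse, hence a bijection
```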
Lemma: For any sequence s of moves that starts at 0 and ends at 3 before -3, we have
Pr[s occurs] / Pr[f(s) occurs] = pb pc^2 / ((1-pb)(1-pc)^2).
Pf: Let
t1 = the number of transitions from 0 to 1 in s;
t2 = the number of transitions from 0 to -1 in s;
t3 = the sum of the numbers of transitions from -2 to -1, -1 to 0, 1 to 2, and 2 to 3;
t4 = the sum of the numbers of transitions from 2 to 1, 1 to 0, -1 to -2, and -2 to -3.
Then the probability that the sequence s occurs is
Pr[s occurs] = pb^t1 (1-pb)^t2 pc^t3 (1-pc)^t4. (Why?)
Transform s to f(s). Then the transition (0, 1) at the last 0 is changed to (0, -1).
After this point, in s the total number of transitions that move up 1 is 2 more than the number of transitions that move down 1, since s ends at 3.
In f(s), the total number of transitions that move down 1 is two more than the number that move up 1.
It follows that the probability f(s) occurs is
Pr[f(s) occurs] = pb^(t1-1) (1-pb)^(t2+1) pc^(t3-2) (1-pc)^(t4+2).
Thus,
Pr[s occurs] / Pr[f(s) occurs] = pb pc^2 / ((1-pb)(1-pc)^2).
Let S be the set of all sequences of moves that start at 0 and end at 3 before -3. Then
Pr[3 is reached before -3] / Pr[-3 is reached before 3]
= (Σ_{s∈S} Pr[s occurs]) / (Σ_{s∈S} Pr[f(s) occurs])
= pb pc^2 / ((1-pb)(1-pc)^2)
= 12321/15379 < 1.
I.e., it is more likely to lose than win.
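The ratio can be verified exactly with Python's fractions module:

```python
from fractions import Fraction

pb, pc = Fraction(9, 100), Fraction(74, 100)
ratio = (pb * pc**2) / ((1 - pb) * (1 - pc)**2)
print(ratio)      # 12321/15379
print(ratio < 1)  # True: losing 3 first is more likely
```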
Another way of showing Game B is losing:
Consider the Markov chain on states {0,1,2}, where the state indicates (w-l) mod 3. Let πi be the stationary probability of state i. The probability that we win a dollar in the stationary distribution is
pb π0 + pc π1 + pc π2 = pb π0 + pc (1 - π0) = pc - (pc - pb) π0.
We want to know whether this is > 1/2 or < 1/2.
The equations for the stationary distribution are:
π0 + π1 + π2 = 1,
pb π0 + (1-pc) π2 = π1,
pc π1 + (1-pb) π0 = π2,
pc π2 + (1-pc) π1 = π0.
Solving,
π0 = (1 - pc + pc^2) / (3 - 2 pc + pc^2 - pb + 2 pb pc).
Plugging in pb and pc, π0 ≈ 0.3826 and pc - (pc - pb) π0 ≈ 0.4913 < 1/2.
Again, a losing game.
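The closed form for π0 can be cross-checked by power iteration of the chain (a sketch with the text's values):

```python
pb, pc = 0.09, 0.74

# Closed form for pi_0 derived above.
pi0 = (1 - pc + pc**2) / (3 - 2*pc + pc**2 - pb + 2*pb*pc)
win = pc - (pc - pb) * pi0
print(round(pi0, 4))  # 0.3826
print(win < 0.5)      # True

# Cross-check pi_0 by power iteration of the chain on {0, 1, 2}.
dist = [1.0, 0.0, 0.0]
for _ in range(10_000):
    d0, d1, d2 = dist
    dist = [pc*d2 + (1 - pc)*d1,   # into state 0: win from 2, lose from 1
            pb*d0 + (1 - pc)*d2,   # into state 1: win from 0, lose from 2
            pc*d1 + (1 - pb)*d0]   # into state 2: win from 1, lose from 0
print(round(dist[0], 4))  # 0.3826
```

The chain is aperiodic (it has cycles of lengths 2 and 3), so the iteration converges to the stationary distribution.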
Combining Games A and B
Game C: Repeatedly perform the following:
Start by flipping a fair coin d.
If d comes up heads, then proceed as in Game A.
If tails, then proceed as in Game B.
It seems Game C should be a losing game.
Analysis:
If 3 | (w-l), then you win with probability
pb' = (1/2) pa + (1/2) pb;
else you win with probability
pc' = (1/2) pa + (1/2) pc.
By the previous lemma,
pb' (pc')^2 / ((1-pb')(1-pc')^2) > 1,
i.e., Game C appears to be a winning game!
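Since the text leaves pa unspecified beyond pa < 1/2, the check below plugs in an assumed illustrative value pa = 0.49 together with pb = 0.09 and pc = 0.74:

```python
pa, pb, pc = 0.49, 0.09, 0.74  # pa = 0.49 is an assumed illustrative value

pb2 = 0.5 * pa + 0.5 * pb  # pb': win probability when 3 | (w - l)
pc2 = 0.5 * pa + 0.5 * pc  # pc': win probability otherwise
ratio = (pb2 * pc2**2) / ((1 - pb2) * (1 - pc2)**2)
print(ratio)
print(ratio > 1)  # True: the lemma's ratio now favors reaching +3
```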
Analysis:
It seems odd, so we check with the Markov chain approach. Plugging in the values, we get
pc' - (pc' - pb') π0' > 1/2.
If the winnings from a round of Games A, B, and C are XA, XB, and XC (respectively), then it seems that
E[XC] = E[(1/2) XA + (1/2) XB] = (1/2) E[XA] + (1/2) E[XB].
But E[XA] < 0 and E[XB] < 0, so how can E[XC] > 0?
Reason: the above equation is wrong! Why? Consider the above-mentioned Markov chain on {0,1,2} for Games B and C.
Letting s represent the current state, we have
E[XC | s] = E[(1/2) XA + (1/2) XB | s] = (1/2) E[XA | s] + (1/2) E[XB | s].
Linearity holds for any given step, but we must condition on the current state.
Combining the games changes how much time the chain spends in each state, allowing two losing games to become a winning game!
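A direct simulation sketch of Game C shows the two losing games combining into a winner (again, pa = 0.49 is an assumed illustrative value with pa < 1/2; pb and pc are the text's values):

```python
import random

def play_game_c(pa, pb, pc, rounds, rng):
    """Game C: each round flip a fair coin d; heads -> a Game A move,
    tails -> a Game B move (coin b iff 3 | (w - l))."""
    winnings = 0
    for _ in range(rounds):
        if rng.random() < 0.5:
            p = pa
        else:
            p = pb if winnings % 3 == 0 else pc
        winnings += 1 if rng.random() < p else -1
    return winnings

rng = random.Random(2)
# With these values the per-round drift works out slightly positive,
# so over a million rounds the winnings are positive with high probability.
print(play_game_c(0.49, 0.09, 0.74, 1_000_000, rng))
```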