It's Math Puzzle Friday! Yay!

We gave this problem to the students at Math Night on Wednesday. It appears in Peter Winkler's book Mathematical Puzzles: A Connoisseur's Collection (on page 67) and is stated as follows:

See if you can solve it.

If you don't know where to start, try to solve it for a deck of 4 cards (2 black/2 red). What about 6 cards (3 black/3 red)?

**Problem 2:** What's the maximum amount he can assure himself of winning for a deck of 2n cards (with n black and n red)?
As far as I can tell, the best strategy is to not bet on the first card (WLOG, assume this card is red), then to bet 3 cents on every card afterward being black, up until a certain point. As for what this point is...

After 50 cards have been played, the worst-case scenario is that 25 of each color card have been played (if 26 of either color has been played, he doubles his money on each of the final two cards). In that scenario, he's guessed correctly 25/49 times, for a net profit of 3 cents, leaving him at $1.03. He can then wait for the final card, for a guaranteed $2.06.

I *think* this is optimal, but I'm not sure. I chose 3 cents as his wager amount, as 4 cents or more would allow him to run out of money with 26 straight red flips to start the game. He can't guarantee more money by choosing whichever color is most likely, as up until 26 of one color has been played, there's always a chance of him guessing wrong and ending up with less than $1 to bet on the last card. In fact, I would think that the fact that any guess *can* be wrong automatically eliminates any strategy that doesn't have every wager planned out before any cards have been flipped (up until 26 of one color have been played, that is).

Whoops, didn't notice problem 2... if my solution to part 1 was correct, his best strategy is to bet ceiling(100/(n-1))-1 cents on each card after the first (with the same restrictions as in my solution to problem 1), netting him a minimum of 2*(99 + ceiling(100/(n-1))) cents.

Ughhh, triple post, I'm such a spammer.

If n > 100, his best strategy is probably just to bet it all on the last card and get $2. At the very least, my strategy no longer works because he can run out of money no matter how much he bets on each card.

No, if n > 100, then just pass on the first (n-100) cards, tracking the colors of the cards drawn. Once n <= 100, then apply the standard strategy.

Simple solution: wait until there is only 1 card left of one colour (and m >= 2 of the other). Bet, say, $0.40 on the dominant colour (the one with m cards). Losing leaves you with $0.60 (which you can double m times, to 2^m * $0.60), and winning leaves you with $1.40 (which you can double once, guaranteed, to $2.80 at the end).

Not the optimal solution, since the first betting amount should obviously depend on m if we want to maximize the worst-case scenario. And if the first bet comes out as expected, we're playing again with m-1 versus 1 cards.

Continuing: the worst case is m = 2. Let's bet $1/3 on the dominant colour. If this is lost, there is only one color left, with two cards, so the outcome will be 2*2*$2/3 = $8/3. If we win the first bet we have $4/3, which we can double on the last card to $8/3.

Thus, using this strategy, for n>=2 there is a guaranteed $8/3 in the end. Possibly even optimal since we can't expect to get m>2.
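The 2-versus-1 endgame arithmetic above is easy to verify with exact fractions; a minimal Python sketch (the variable names are mine):

```python
from fractions import Fraction

cash = Fraction(1)
bet = Fraction(1, 3)  # stake $1/3 on the dominant colour when it has 2 cards vs 1

# Lose the bet: both remaining cards are the other colour, so double twice.
lose_branch = (cash - bet) * 2 * 2
# Win the bet: skip the even-money 1-vs-1 card, then double on the known last card.
win_branch = (cash + bet) * 2

print(win_branch, lose_branch)  # 8/3 8/3
```

Both branches end at exactly $8/3, matching the claim.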

What I would do is:

wait until there are no cards left of one colour. After that I can bet everything on the other. In that case the amount of money I end with will be between $2 (if I can only play the last card) and $2^n (if I am very lucky and the first n cards are all the same colour).

By the way: there are four colours in a playing-card deck, aren't there? (spades, hearts, diamonds and clubs) In that case my strategy produces much less profit ^^

Not quite. There are 4 suits (your list) divided into 2 colors (spades and clubs are "black", hearts and diamonds are "red").

OT note: in poker, the phrase "all blue" is often used to indicate a hand of either all spades or all clubs (i.e., a black-suit flush).

The solution I have reached for a deck of size 2n is:

Product[i=1..n : 2i/(2i-1)]

This leads to the following solutions: 2/1, 8/3, 16/5, 128/35, 256/63, 1024/231, ...

I didn't bother to find a closed formula.

For the more general case (m black, n red) the solution is:

f(0,n) = f(n,0) = 2^n

f(n,m)=2*f(n-1,m)*f(n, m-1)/[f(n-1,m) + f(n, m-1)]

2^(2n-1) / C(2n-1,n) (for a (n,n) situation)

That's the answer I get too...based on making a bet of 2p-1, where p is the probability of correctly guessing the next card, on each opportunity. I don't have a full proof yet, though.
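The recursion and the closed form can be machine-checked against each other with exact arithmetic; a Python sketch (the function name `f` follows the recursion above):

```python
from fractions import Fraction
from functools import lru_cache
from math import comb

@lru_cache(maxsize=None)
def f(n, m):
    """Guaranteed payout with n red and m black cards remaining."""
    if n == 0 or m == 0:
        return Fraction(2) ** (n + m)  # every remaining card is known: double up
    a, b = f(n - 1, m), f(n, m - 1)
    return 2 * a * b / (a + b)         # harmonic mean of the two child positions

# The closed form 2^(2n-1) / C(2n-1, n) matches the recursion exactly.
for n in range(1, 27):
    assert f(n, n) == Fraction(2 ** (2 * n - 1), comb(2 * n - 1, n))

print(round(float(f(26, 26)), 2))  # 9.08
```

For a 52-card deck this gives roughly $9.08, agreeing with the values quoted later in the thread.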

I can always win at the end: (2n)!!/(2n-1)!!, where n is the quantity of cards of the more numerous color. (If there are 34 black and 31 red, then n = 34.)

The condition is that you always bet (M-N)/(M+N) of your amount on the more probable outcome, until only cards of one color remain. Then you always bet everything. M is the number of cards of the dominant color, N is the number of cards of the other color.

Have a look :)

http://www.wolframalpha.com/input/?i=%282n%29!!%2F%282n-1%29!!

It's interesting that this matches the Kelly Criterion (http://en.wikipedia.org/wiki/Kelly_criterion), although that is to do with maximising your expected income and this is to do with maximising your guaranteed income.

Because this situation ends with some number of guaranteed bets, using the Kelly criterion ensures that minimum income = expected income = maximum income.

That was my thought too--but I haven't finished the analysis. By the way, you don't need the second rule (about all of one color) since the formula (M-N)/(M+N) gives one whenever N=0, so you automatically bet everything.

The one thing I'm not sure of is how the quantized bankroll affects things. That is, how exactly do you bet 3/29ths of $3.14, or whatever you end up with. Do you round to the nearest cent? How does it affect the final outcome?

My answer would be a minimum of $2.66

I would wait until there was only one card of one color left. In the worst case, there would be 2 of one color and one of the other. If you bet n cents and bet wrong, you now have (100 - n) cents, which you will double up on the remaining two cards, giving you 4(100 - n). If you bet correctly on the first bet, you sit out the next card and double up on the last card, giving you 2(100 + n). So to maximize your guaranteed winnings, we need to find n where 4(100 - n) = 2(100 + n),

which works out to n being 33 1/3. Assuming you can only bet in penny increments, you would bet 33 cents on the colour that wasn't the single card left. If you lost, you'd have $0.67, which you would double twice (since the last 2 cards are the same color) for $2.68. If you win the first bet, you have $1.33, which you pass on the second card and double on the last for a total of $2.66. So $2.66 is the least you can win.
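A minimal check of that break-even equation and the two penny-rounded branches (amounts in cents):

```python
from fractions import Fraction

# The break-even stake n solves 4*(100 - n) = 2*(100 + n), i.e. 6n = 200.
n = Fraction(200, 6)
assert 4 * (100 - n) == 2 * (100 + n)

# With whole-cent bets, stake 33 cents:
lose = 4 * (100 - 33)  # lose, then double twice on the two known cards -> 268 cents
win = 2 * (100 + 33)   # win, skip the even-money card, double the last -> 266 cents
print(n, lose, win)  # 100/3 268 266
```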

He can double his money on each card by betting that it isn't purple. So for a deck of n cards, he can win 2^n-1 dollars (because he didn't *win* his first dollar; he brought it with him).

He can win even more if Paula allows him to bet with improper fractions! Paula, you FOOL!

And that's (2^n)-1, not 2^(n-1), just to be *absolutely clear*.

Let's say x is the money the player has left, n is the number of cards remaining and c is the number of cards left of the more frequent color. The optimal bet in this case is x*(2c/n - 1) and the resulting money after the game has been played out completely will be x * 2^n / (n choose c).

I have reached this result through induction, but it would be tedious to type it out here :)
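For small decks that claim is easy to check exhaustively with exact fractions; a sketch (the helper name `play` is mine):

```python
from fractions import Fraction
from itertools import permutations
from math import comb

def play(deck):
    """Bet the fraction 2c/n - 1 of the bankroll on the majority colour each turn."""
    cash = Fraction(1)
    r, b = deck.count('R'), deck.count('B')
    for card in deck:
        frac = Fraction(2 * max(r, b), r + b) - 1      # zero when the counts are tied
        bet = cash * frac
        cash += bet if card == ('R' if r >= b else 'B') else -bet
        r, b = (r - 1, b) if card == 'R' else (r, b - 1)
    return cash

# Every ordering ends at exactly x * 2^n / C(n, c), independent of luck.
for reds, blacks in [(2, 2), (3, 3), (3, 1), (4, 2)]:
    total, c = reds + blacks, max(reds, blacks)
    target = Fraction(2 ** total, comb(total, c))
    assert all(play(p) == target for p in set(permutations('R' * reds + 'B' * blacks)))
```

The payout being identical on every ordering is exactly what makes the guaranteed and expected winnings coincide here.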

Well, it says "guaranteed", so even if you had a highly probable chance of getting it right, it still is not guaranteed, and so the answer would still be $2 in the worst deck-shuffling cases.

That sounds intuitive, but it's wrong. Say you wait till there are 3 cards left, and bet 33 cents on whichever color there are more of. If you win, you have $1.33, so you skip the even-money bet and bet it all on the last card, winding up with $2.66. If you lose, you only have 67 cents left, but you know which color the last 2 cards are, so you can double your money each time, winding up with $2.68.

It gets more complicated with more cards, but the basic strategy is the same: don't bet on even money bets, always bet the amount such that you end up with the same whether you win or lose the bet.

He could count the number of cards of a specific color; when that count reaches 26, or when there are x cards left where x equals 26 minus his count (i.e. all of the cards left are the color he is counting), he could double up that many times. Unless the deck ends with alternating colors, he could get more than $2 just about every time.

Go go gadget trivial case!

For 2 balls: skip 1 bet 1, = 2

For 4 balls, skip 1, bet 1/3: win: 4/3, skip 1, bet all = 8/3 = 2.66; lose: 2/3, bet twice all, = 8/3

For 6 balls, skip 1, bet 1/5: win 6/5, cont 4 balls: 8/3*6/5 = 16/5

lose: 4/5, bet 2/5 [win: 6/5 etc 4 balls), lose: 2/5, bet 3 times = 16/5]

8 balls, skip 1, bet 1/7, win: 8/7 * 6/5 * 4/3 * 2, lose: bet 2/7 etc.

x balls, skip 1, bet 1/(x-1), etc.

Formula: product[2n/(2n-1)] n = 1 to x/2

x=52: product[2n/(2n-1)] n=1 to 26 and this equals $9.08

It is less if you can bet only whole cents: about $8.36

Proof: first, let f(a,b) be the maximum we can guarantee with a red cards and b black cards. For fixed n and m, let b be the optimal bet on "the card is red" (b may be negative). b has to be such that (1+b)*f(n-1,m) = (1-b)*f(n,m-1) = f(n,m). From that, we get the formula from Alon:

f(n,m) = 2*f(n-1,m)*f(n,m-1) / [f(n-1,m) + f(n,m-1)].

We want to prove that the optimal bet is always b = 2p-1 (where p is the probability that the card is red, which is n/(n+m)). That means that

p*f(n-1,m) = (1-p)*f(n,m-1), or n*f(n-1,m) = m*f(n,m-1). Let us proceed by induction on n+m.

By induction, we have f(n-2,m) = (m/(n-1))*f(n-1,m-1), and similarly for the symmetric case. Let x = f(n-1,m-1). Then f(n-1,m) = [2*(m/(n-1))*x²] / [((n+m-1)/(n-1))*x] = 2*(m/(n+m-1))*x. The result follows.

Using that, we have f(n,n) = f(n,n-1) = (1+b)*f(n-1,n-1)

= [2n/(2n-1)] * f(n-1,n-1), hence

f(n,n) = Product[i=1..n : 2i/(2i-1)] = 2^(2n-1) / C(2n-1,n)

Maybe we need an answer to these puzzles. There are a lot of differing answers.

As mentioned by others, for 4 cards the best you can do is $8/3. For 6 cards it is $16/5.

This gives me hope that my calculations are not wrong!!!!!

I see someone else already noted the improper fractions method, then the limit is infinite dollars, by waiting until the final card. :)

I'm unclear how you guarantee yourself to finish with more than $4 though.

With 2 cards it's simple, wait for last card, bet it all, $2.

With 4 cards, the deck will either be BBRR, BRBR, BRRB. Order won't matter since after the first card is revealed, there will be an optimal strategy. You can miss the first 3 in 2/3 of those scenarios, so don't you have to wait until there is a guarantee -- last card in the final 2 series and second to last card in the first one? So you "could" guarantee yourself doubling up twice, but it is NOT guaranteed that you will do so -- you're only TOTALLY guaranteed to double up once, on the last card, no matter how many cards there are.

The total you COULD guarantee yourself AT a certain position in the game would be to have all X cards left of one color, doubling up every time. Then the 2^n-1 makes sense.

In general, if there are R red cards and B black cards left, you should bet the fraction (R-B)/(R+B) of your money that the next card is red. If this is negative, that means you bet on black.

One can then check by induction that the payout P(R,B) will be (2^(R+B))/( (R+B) choose B ).

Another induction verifies this is optimal.

Nice puzzle, keep it up.

My solution so far is:

Payout P(R,B):=

2/(1/P(R-1,B) + 1/P(R,B-1)) for R>0,B>0;

2^R for R>0,B=0;

2^B for R=0,B>0;

1 for R=0, B=0;

Excel can solve that with ease; the solution for 26 black and 26 red cards is $9.08.
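The same table Excel builds takes only a few lines in code; a sketch using exact fractions (the table name `P` is mine):

```python
from fractions import Fraction

N = 26
# P[r][b] = guaranteed payout with r red and b black cards left.
P = [[Fraction(0)] * (N + 1) for _ in range(N + 1)]
for r in range(N + 1):
    for b in range(N + 1):
        if r == 0 or b == 0:
            P[r][b] = Fraction(2) ** (r + b)                   # base cases, incl. P[0][0] = 1
        else:
            P[r][b] = 2 / (1 / P[r - 1][b] + 1 / P[r][b - 1])  # harmonic mean

print(round(float(P[26][26]), 2))  # 9.08
```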

what if you cannot bet fractions of cents?

I will post my solution over at the Forum. Suffice it here to say that, using Fingolfin's optimal bet, Victor should go home with about $9.08.

Sorry for the double post.

Answer to question 2:

The maximum assured amount Victor should go home with, with a deck of 2n cards (n of each of two colors), is:

2^(2n) * n! * n! / (2n)!

As said, for n = 26 (a standard poker/bridge deck), the value is about $9.08.

For 52 balls and starting with 100 cents.

Optimal bet is: abs( capital*(red-white)/(red+white) )

Theoretical win is: product[2n/(2n-1)] n = 1 to 26: 908

If you can only bet whole number of cents: optimal bet rounded down

Minimal win (skip when red=white, then win, loop): 836

Maximal win (skip, lose 25 times, then win 26 times): 67108864

100 simulations with random win/lose based on number of red/white:

resulting values between 848 and 976
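The 836-cent worst case can be reproduced with a short simulation, assuming the bettor rounds the optimal stake down to a whole cent and the deck alternates colours (skip at equal counts, then win the off-count bet):

```python
from fractions import Fraction

# Whole-cent play against an alternating deck: at equal counts the optimal
# stake is zero (skip); the revealed card then makes the other colour the
# favourite, and on this deck that bet always wins.
capital = 100  # cents
red = white = 26
while red + white:
    red -= 1                                             # skip the card shown at equal counts
    capital += capital * (white - red) // (red + white)  # stake rounded down; bet wins
    white -= 1
print(capital)  # 836

# Theoretical optimum with fractional cents: 100 * product 2n/(2n-1), n = 1..26.
ideal = Fraction(100)
for n in range(1, 27):
    ideal *= Fraction(2 * n, 2 * n - 1)
print(round(float(ideal)))  # 908
```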

Woot! I actually figured one out. I tried it with 4 cards and 6 cards by hand, and it worked. It seems like if you bet the expected value of your winnings had you bet all your money (on whichever card is more likely or 0 if neither is more likely), you will end up with the same amount of money at the end of the game no matter what combination of cards you get (if fractional cents are allowed). This somewhat makes sense because you should bet based on how sure you are that you will win a particular deal. I went through the game by hand for 52 cards (not allowing fractional cents just rounding to the nearest cent), and it seems like he should win somewhere in the neighborhood of $9.

Just to generalize a bit, the formula for the amount of money you can definitely walk away with, assuming there are N black cards and M red cards to start with, is 2^(N+M) / C(N+M,N)

Interesting to note is that if N=M, the formula becomes 2^(2N) / C(2N,N) and since you shouldn't bet on the first card, that comes out the same as 2^(2N-1) / C(2N-1,N)
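That equality is an exact identity, since C(2N,N) = 2*C(2N-1,N); a one-loop check:

```python
from fractions import Fraction
from math import comb

# C(2N, N) = 2 * C(2N-1, N), so the extra factor of 2 in the numerator cancels.
for N in range(1, 30):
    assert comb(2 * N, N) == 2 * comb(2 * N - 1, N)
    assert Fraction(2 ** (2 * N), comb(2 * N, N)) == Fraction(2 ** (2 * N - 1), comb(2 * N - 1, N))
print("identity verified for N = 1..29")
```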

Possible solution in the forums. As pointed out by one of the readers, starting with n black and n red cards the most amount of money you can guarantee to win is asymptotic to sqrt(pi*n).

What about betting one cent, and then doubling each bet? I know people do that on roulette, and it guarantees a profit if you can double as many times as you want.

The answer, which can be proven by induction, is that for n cards out of which k are black, the highest guaranteed profit will be

(2^n)/(n choose k) (assuming the initial $1 also counts as profit).

An interesting fact is that this equals 2^(number of cards) / (the number of different "paths" the game can take),

where a "path" is a series of k black cards and n-k red cards in some order. I haven't yet found a good explanation for this, but if one does find an explanation it might provide a better proof for this result.

The case of n=2k gives (2k)!!/(2k-1)!!.
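The double-factorial form agrees with the binomial form; a quick check (`dfact` is my helper):

```python
from fractions import Fraction
from math import comb

def dfact(n):
    """Double factorial: n!! = n * (n - 2) * (n - 4) * ... down to 1 or 2."""
    out = 1
    while n > 1:
        out *= n
        n -= 2
    return out

# (2k)!! / (2k-1)!! == 2^(2k) / C(2k, k), i.e. the n = 2k case above.
for k in range(1, 27):
    assert Fraction(dfact(2 * k), dfact(2 * k - 1)) == Fraction(2 ** (2 * k), comb(2 * k, k))
```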

Wait a second, I've seen a number of people claim that the minimum you can walk away with in the 4-card case is $8/3, but I can do better than that. See the first card. For the second card, bet $1/4 on the color you just saw. If you are correct, you double up twice and leave with $5 (1.25 -> 2.5 -> 5). If incorrect, you will still get to double up twice with $3/4 left, giving you $3 (.75 -> 1.5 -> 3). Haven't had a chance to actually think about other cases, but just thought I'd mention this.

Er, whoops, hadn't thought about it enough, clearly. Forgot that a successful bet on the second card would leave you with a 50/50 shot at the next one, which is not ideal. Sorry, that's what I get for trying to do stuff like this in the early morning :P