Wednesday, April 9, 2014

The Magic Coin Game

Weirdness abounds in probability (and I mean the kind of probability that can be understood by regular folk, not the kind that involves things like measure theory. That kind of probability also contains weirdness, but it is only understood by mathozoids who I imagine use some sort of proboscis to acquire nutrients when the rest of us aren't looking).

Perhaps the most famous of all probability weirdness is the Monty Hall problem, named after the eponymous host of the '70s game show in which it appeared. Briefly: you are shown three doors. Behind one door is a good thing (a car, money, Claire Danes' underpants). Behind the other two doors are bad things (a goat, a beating, Hitler's underpants). You pick one of the three doors. Monty shows what's behind one of the doors you didn't pick. You must decide to stick with your original selection or switch to the door Monty didn't show you. Monty then opens your door and shows you what you have won.

The famous result is that switching improves your chance of getting the good thing. That seems redonkulous, but careful analysis shows it to be true.

Explanations of the Monty Hall problem have turned up everywhere from AskReddit to Parade Magazine. If you are holding a probability textbook (and why wouldn't you be?) it's probably mentioned in it somewhere. I won't discuss the solution here (see: the Google avalanche returned by searching on "monty hall problem"). Rather, the MHP got me thinking about other odd fish that lurk in the dark sea called probability. Other weirdnesses proffered by Lady Luck, her strapless evening gown and feminine jowls calling us forth, seeking to separate us from nous and paycheck.

"I am fortune's fool!" Romeo exclaimed. Yes you are, Mr. DiCapricorn. And you better keep your paws off Claire or there's going to be trouble.

I call this weirdness: The Magic Coin Game.



A standard coin, when tossed, has a probability, p, of landing heads equal to 1/2. Not so the magic coin. A number is selected uniformly at random between 0 and 1. That number is p for the magic coin. We ask: how many flips do we expect it will take to get a heads with the magic coin?

The answer is: infinity. Infinity flips.

A result that, like the Monty Hall problem, defiles common sense. But a result nonetheless true, as I will show in a moment.

The language I used to describe the game is a little stilted (how many flips do we "expect"??) so let me recast things in the plain tongue of gambling. Suppose each flip costs you $1, and throwing a head wins you a sum of money which we will call the "payoff." The payoff has to be large enough to entice you to play -- say $10. Infinite expectation means that -- if we repeat the game enough times -- you will go bankrupt. Guaranteed. No matter how much money you start with. No matter what the payoff. I will always get all of your money eventually.

This all seems counterintuitive (well, it did to me). For comparison, with a normal coin having p = 1/2, we expect, on average, to have to throw twice to get a head. So if the payoff was $2, you should expect to break even. But with the magic coin, I always come out ahead if we play enough games, even if you come out ahead on most of the games. (It goes without saying that if I am a skilled grifter, I make sure a p value favorable to you is "chosen" for the first couple of games -- close to 1.0 -- to get you hooked. You can't just clobber the mark all at once, like that time at SFN when a New Orleans street urchin attempted to forcibly shine my sneakers for $80.)
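The fair-coin baseline is easy to check by brute force. A minimal Python sketch (Python standing in for the MATLAB used below; the function name is mine), estimating the expected number of flips to the first head when p = 1/2:

```python
import random

def flips_to_first_head(p, rng):
    """Flip a coin with P(heads) = p until the first head; return the count."""
    n = 1
    while rng.random() >= p:  # rng.random() < p counts as heads
        n += 1
    return n

rng = random.Random(0)
trials = [flips_to_first_head(0.5, rng) for _ in range(100_000)]
print(sum(trials) / len(trials))  # sample mean; lands near 2, matching E(N) = 1/p
```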

Here's a proof of the result. (If you read LabKitty for the charming wiseassery and not the boss math derivations, you can skip down to the pictures. The next paragraph just proves the results using the applicable machinery, much like Joyce proved by algebra that Hamlet's ghost was Shakespeare's father. We'll return to charming wiseassery after.)

We have a mixed 2-stage experiment. Call the probability of getting heads P and the number of flips to get a head N. (They are capital letters because they are random variables.) If P = p, then the expected value of N is just 1/p (N is then a geometric random variable with parameter p). In symbols: E(N|P=p) = 1/p. As such, applying the law of total expectation we obtain: E(N) = ∫ E(N|P=p) f(p) dp, where f is the density of P. But f(p) = 1 and the integral goes from zero to one (because P is uniform on [0, 1]). We obtain

  E(N) = ∫₀¹ (1/p) dp = ln p │₀¹ = 0 − (−∞) = ∞

or, infinity flips. QED.

I didn't believe this either, so I coded it up in MATLAB to see what the Hecuba is going on (what I call "Monte Carlo simulation" on my CV). The following plot shows the results of repeating the game 1000 times (game # on the x-axis; number of flips for that game on the y-axis):

[Plot: game # (x-axis) vs. # of flips (y-axis)]
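The MATLAB code isn't reproduced here, but the same Monte Carlo experiment can be sketched in Python (the names are mine): draw p uniformly, flip until a head, repeat, and watch the running average get yanked upward by the rare tiny-p games.

```python
import random

def magic_coin_game(rng):
    """One game: draw p ~ U(0,1), then flip a p-coin until the first head."""
    p = rng.random()  # note: random() lives in [0, 1), echoing the rand() caveat below
    n = 1
    while rng.random() >= p:
        n += 1
    return n

rng = random.Random(42)
games = [magic_coin_game(rng) for _ in range(1000)]

# Running averages: dominated by the occasional tiny p and its huge N
for k in (10, 100, 1000):
    print(k, sum(games[:k]) / k)
```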

The dealio is that a tiny p makes for a large N that drags the average up. Although P is uniform, the occasional tiny p and concomitant large N does more violence to E(N) than the occasional large p and concomitant small N (which must be at least 1) does to remedy things. In the limit, as the number of simulations becomes infinite, that average is infinity.
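For the skeptical, the same conclusion falls out of the tail probabilities. Averaging over p, the chance of needing more than n flips is

  P(N > n) = ∫₀¹ (1 − p)ⁿ dp = 1/(n + 1)

and summing the tail gives the expectation:

  E(N) = Σₙ₌₀^∞ P(N > n) = Σₙ₌₀^∞ 1/(n + 1) = ∞

That's the harmonic series -- the tail decays too slowly for the mean to exist.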

At least that's the story I'm going with.

Translating this into gambling, here are some simulations of playing the Magic Coin game with payoffs of $5, $10, and $20, starting with $100 in your pocket.

[Plot: game data for $5/$10/$20 payoffs]

As before, game # is on the x-axis, but now the y-axis is the amount of money in your pocket. A crash to zero means you have gone bankrupt. Each plot shows three different games or, if you'd like, three different players. With a $5 payoff (top), the mark doesn't stand a chance. With a $10 payoff (middle), players hang in there longer and earn a lot more, but by game 1000 all their money still belong to us. Only with a payoff of $20 (bottom) do any players make it past 1000 games and they rack up some impressive winnings. Here, the red player is already north of $10,000 by game 800, but he or she will get got eventually. Math doesn't lie (although it frequently obfuscates).
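A Python sketch of the gambling version (again standing in for the MATLAB; the $100 stake, payoffs, and 1000-game cap follow the text, everything else is mine):

```python
import random

def play_magic_coin(payoff, bankroll, max_games, rng):
    """Each flip costs $1; the first head pays `payoff`. Returns the bankroll
    after each game, stopping early on bankruptcy."""
    history = [bankroll]
    for _ in range(max_games):
        p = rng.random()           # fresh magic p for this game
        while bankroll > 0:
            bankroll -= 1          # pay for the flip
            if rng.random() < p:   # heads: collect the payoff, end the game
                bankroll += payoff
                break
        history.append(bankroll)
        if bankroll <= 0:          # bankrupt: the house wins
            break
    return history

for payoff in (5, 10, 20):
    h = play_magic_coin(payoff, 100, 1000, random.Random(7))
    print(f"${payoff} payoff: survived {len(h) - 1} games, ended with ${h[-1]}")
```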

Footnote: there's known wonkiness in the MATLAB random number generator: even though it's advertised as U[0,1], it doesn't return values that go all the way down to zero -- at least not in the version I'm using. Ergo, the simulations shown here are more favorable to the player than they should be. They're meant to be illustrative.

Clearly, Magic Coin makes for a lousy casino game from the casino's perspective. It takes too long for players to lose their shirt even with a tinpot payoff, and it's too easy to get too far ahead before that happens -- far enough ahead that any non-psychotic would cash in their chips and make for the buffet. Still, the weirdness stands: an infinite number of expected flips. Go figure.

As granddaddy used to say: mathematics doesn't teach you how to think correctly, it teaches you how easy it is to think incorrectly.

Right before they hanged him.
