Monday, October 20, 2014

Limiting frequencies and probabilities

You are one of infinitely many blindfolded people arranged in a line with a beginning and no end. Some people have a red hat and others have a white hat. The process by which hat colors were assigned took no account of the order of people. You don't know where you are in the line. Suppose you learn the exact sequence of hat colors, say, RWRRRRWRWRWWWWRWWWR.... But you still don't know your position. What should your probability be that your hat is red?

A natural way to answer this is to compute the limiting frequency of reds: let R(n) be the number of red hats among the first n people, and see whether R(n)/n converges to some number. If it does, then that number, call it r, seems to be a reasonable value for the probability. Call the rule that assigns probability r whenever the limit exists the frequency rule.
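
Concretely, the rule is just a running ratio. Here is a minimal sketch in Python (the helper name is mine, and the input is only the post's finite sample prefix, purely for illustration); the frequency rule asks whether these ratios settle down to a limit r as n grows without bound:

```python
def running_frequency(hats):
    """Yield R(n)/n: the fraction of red hats among the first n people."""
    reds = 0
    for n, hat in enumerate(hats, start=1):
        reds += (hat == "R")
        yield reds / n

# The post's sample prefix; a real run would continue through the whole
# infinite sequence and ask whether these ratios converge.
for n, freq in enumerate(running_frequency("RWRRRRWRWRWWWWRWWWR"), start=1):
    print(n, round(freq, 3))
```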

Here's a curious and simple thing I hadn't noticed before. If you think the frequency rule is always the right rule, then for every integer n, you are committed to being almost certain that your position is greater than n. Here's why. Suppose that the sequence that comes up is n white hats followed by nothing but red hats. Then the limiting frequency of reds is 1: for m > n, R(m)/m = (m-n)/m, which converges to 1. So by the frequency rule, you're committed to assigning probability 1 to having a red hat. But since you have a red hat if and only if your position is greater than n, you are committed to assigning probability 1 to your position being greater than n. And since there is no connection between the hat color arrangement and the order of people on the line, if you have this commitment after learning the sequence of hat colors, you also had it before. The argument works for every n, so for every n you must have been almost certain that your position in the sequence is greater than n.

And this in turn leads to the paradoxes of nonconglomerability. For instance, suppose that I flip a fair coin. If it's heads, I let N be your position number. If it's tails, I choose a number N at random such that P(N=n) = 2^(-n). In either case, I reveal to you the value of N, but not how the flip went. For any number n, the probability that N=n is zero given heads (since you're almost certain that your position is greater than n), and the probability that N=n is greater than zero given tails, so by Bayes' Theorem you will be almost certain that the coin landed tails. So I can make you certain that a coin landed tails, and thereby exploit you in paradoxical ways.
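
To see the Bayes step numerically, here is a minimal sketch with one assumption made explicit: since the "almost certainly my position exceeds n" prior is improper, I stand in for it with a uniform prior on {1, ..., M} (the bound M and the function name are mine) and let M grow. The posterior probability of tails given N = n then climbs to 1:

```python
from fractions import Fraction

def posterior_tails(n, M):
    """P(tails | N = n), with my position given a uniform prior on {1..M}.

    Heads: N is my position, so P(N = n | heads) = 1/M (for n <= M).
    Tails: N is chosen with P(N = n | tails) = 2**(-n).
    The coin is fair, so the 1/2 priors cancel in Bayes' theorem.
    """
    like_heads = Fraction(1, M)
    like_tails = Fraction(1, 2**n)
    return like_tails / (like_tails + like_heads)

for M in (10**2, 10**6, 10**12):
    print(M, float(posterior_tails(5, M)))
# As finite positions get ever less weight (M -> infinity),
# the posterior for tails approaches 1.
```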

So the frequency rule isn't as innocent as it seems. It commits one to something like an infinite fair lottery.

5 comments:

  1. "The process by which hat colors were assigned took no account of the order of people." How would you do this? Not by any ordinary sequential process, either deterministic or random (like tossing coins). Wouldn't it require something like a randomisation over infinite permutations? This would be equivalent to an infinite sequence of infinite fair lotteries. So it is no surprise that you get the usual paradoxes associated with such lotteries.

  2. Why not just an infinite number of simultaneous coin flips?

    (Or a sequential infinity, while we're at it, since on the usual models of an infinite sequence of coin throws, the probability measure is invariant under permutations, and hence the distribution of outcomes doesn't depend on order.)

  3. If you toss coins, then the probability of getting only a finite number of white hats is zero. So if you condition on that event (as the argument does), you should expect paradoxes.

    Assigning n white hats to an infinite number of people "without taking account of the order of people" is precisely a fair infinite lottery with n winners. Any procedure that does this (with non-zero probability) will give the usual paradoxes of such lotteries.

  4. I wouldn't say you *expect* paradoxes. Many cases of conditioning on null sets are perfectly fine.

    And it's not like we were assigning n white hats to an infinite number of people. That's just how it turned out: an outcome surely within the realm of possibility, if an infinite sequence of independent fair coin throws is. :-)

  5. Possible, but :-) An infinite sequence of fair coin throws can be set up as a supertask. (The first throw takes 1/2 sec, the second takes 1/4 sec, etc.) So can an infinite sequence of such sequences. Even if you do this, with probability 1 none of the sequences will have only a finite number of whites.

    But I agree with your conclusion. Suppose you have a proper prior for your position on the line. Then, given the actual sequence of hats, you can compute P(red) directly. This will usually disagree with the frequency rule. To make the two agree in general, you need an (improper) uniform prior, or a limit of uniform-on-{1,...,n} priors, or some such. Whatever you do has to give zero effective weight to any finite set of positions. (See the sketch after the comments for a quick numerical check.)

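The last comment's point is easy to check numerically. Here is a minimal sketch (the priors, the names, and the truncated stand-in for the infinite sequence are my own, purely for illustration): a proper geometric prior over positions is dominated by the early hats, so P(red) can sit near 0 even though the limiting frequency of reds is 1, while uniform-on-{1,...,n} priors recover R(n)/n and hence the frequency rule in the limit.

```python
def p_red(hats, prior):
    """P(my hat is red) = sum over positions k of prior(k) * [hat k is red]."""
    return sum(p for hat, p in zip(hats, prior) if hat == "R")

# Stand-in for "20 white hats followed by red hats forever",
# truncated at 10020 positions.
hats = "W" * 20 + "R" * 10**4

# Proper geometric prior, P(position = k) = 2**(-k): almost all weight
# falls on the first few (white-hatted) positions.
print(p_red(hats, (2.0**-k for k in range(1, len(hats) + 1))))  # ~9.5e-07

# Uniform prior on {1, ..., n} gives exactly R(n)/n.
n = len(hats)
print(p_red(hats, (1 / n for _ in range(n))))  # 10000/10020 ~ 0.998
```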