Monday, August 24, 2020

Invariance under independently chosen random transformations

Often, a probabilistic situation is invariant under some set of transformations, in the sense that the complete probabilistic facts about the situation are unchanged by the transformations. For instance, in my previous post I suggested that a sequence of fair coin flips should be invariant under the transformation of giving a pre-specified subset of the coins an extra turn-over at the end, and I proved that we can have this invariance in a hyperreal model of the situation.
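
For the finite case the invariance is elementary, and a quick sanity check makes the point (the substantive claim of the previous post, of course, concerns the infinite hyperreal case). Coding heads as 1 and tails as 0, an extra turn-over of a pre-specified subset is XOR with a fixed mask:

```python
from itertools import product

# A pre-specified extra turn-over is XOR with a fixed mask. On n fair coins
# (heads = 1, tails = 0) this is a bijection on the 2^n equally likely
# outcomes, so the joint distribution is unchanged.
n = 4
mask = (1, 0, 1, 0)  # an arbitrary pre-specified subset of coins to reverse

outcomes = list(product((0, 1), repeat=n))
reversed_outcomes = [tuple(c ^ m for c, m in zip(o, mask)) for o in outcomes]

assert sorted(reversed_outcomes) == sorted(outcomes)  # a permutation of outcomes
```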

Now, a very plausible thesis is this:

Randomized Invariance: If a probabilistic situation S is invariant under each member of some set T of transformations, then it is also invariant under the process where one chooses a random member of T independently of S and applies that member to S.

For instance, in the coin flip case, I could choose a random reversing transformation as follows: I line up (physically or mentally) the infinite set of coins with an independent second infinite set of coins, flip the second set of coins, and wherever that flip results in heads, I reverse the corresponding coin in the first set.
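
For concreteness, here is a toy finite simulation of such a randomized reversal (again coding heads as 1 and tails as 0, so that a reversal is an XOR with an independent fair bit):

```python
import random

def random_reversal(first, rng):
    """Reverse coin i of `first` wherever an independent fair coin shows heads."""
    second = [rng.randint(0, 1) for _ in first]    # the independent randomizer coins
    return [a ^ b for a, b in zip(first, second)]  # heads on randomizer i flips coin i

rng = random.Random(0)
trials, n = 100_000, 8
heads = 0
for _ in range(trials):
    first = [rng.randint(0, 1) for _ in range(n)]
    heads += sum(random_reversal(first, rng))

# Each transformed coin is an XOR of two independent fair bits, hence still fair:
print(heads / (trials * n))  # comes out near 0.5
```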

By Randomized Invariance, doing this should not change any of the probabilities. But insisting on this case of Randomized Invariance forces us to abandon the idea that we should assign such things as an infinite sequence of heads a non-zero but infinitesimal probability. Here is why. Consider a countably infinite sequence of fair coins arranged equidistantly in a line going to the left and to the right. Fix a point r midway between two successive coins. Now, use the coins to the left of r to define the random reversing transformation for the coins to the right of r: if after all the coins are flipped, the nth coin to the left of r is heads, then I give an extra turn-over to the nth coin to the right of r.

According to Randomized Invariance, the probability that all the coins to the right of r will be tails after the random reversing transformation will be the same as the probability that they were all tails before it. Let p be that probability. Observe that after the transformation, the coins to the right of r are all tails if and only if before the transformation the nth coin to the right and the nth coin to the left showed the same thing for every n (we end with tails on the nth coin on the right either if it started as tails and the nth coin on the left was also tails, so that there was no reversal, or if it started as heads and the heads on the nth coin to the left forced us to reverse it). Hence, p is also the probability that the corresponding coins to the left and right of r showed the same thing before the transformation.
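
This biconditional can be checked mechanically in a finite truncation (heads coded as 1, tails as 0, with the coins on each side indexed outward from r):

```python
from itertools import product

# Exhaustive check in a finite truncation: n coins on each side of r,
# left[i] and right[i] being the (i+1)th coins to the left and right of r
# (heads = 1, tails = 0). Reversing right coin i exactly when left coin i
# is heads sends right coin i to right[i] XOR left[i].
n = 3
for left in product((0, 1), repeat=n):
    for right in product((0, 1), repeat=n):
        after = tuple(r ^ l for r, l in zip(right, left))
        all_tails_after = not any(after)
        palindrome_before = (left == right)  # each pair equidistant from r matches
        assert all_tails_after == palindrome_before
```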

Thus, we have shown that the probability that all the paired coins on the left and right equidistant from r are the same (i.e., we have a palindrome centered at r) is the same as the probability that we have only tails to the right of r. Now, apply the exact same argument with “right” and “left” reversed. We conclude that the probability that the coins on the right and left equidistant from r are always the same is the same as the probability that we have only tails to the left of r. Hence, the probability of all-tails to the left of r is the same as the probability of all-tails to the right of r.

And this argument does not depend on the choice of the midpoint r between two coins. But as we move r one coin to the right, the probability of all-tails to the right of r is multiplied by two (there is one fewer coin that needs to be tails) and the probability of all-tails to the left of r is multiplied by a half (there is one more coin that needs to be tails). And yet these numbers have to be equal as well by the above argument. Thus, 2p = p/2. The only way this can be true is if p = 0.
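
In symbols, with r′ the midpoint one coin to the right of r:

```latex
\[
\begin{aligned}
p &= P(\text{all-tails right of } r) = P(\text{all-tails left of } r),\\
P(\text{all-tails right of } r') &= 2p,\qquad
P(\text{all-tails left of } r') = \tfrac{p}{2},\\
2p &= \tfrac{p}{2}\ \text{(the argument rerun at } r'\text{)},
\qquad\text{so } p = 0.
\end{aligned}
\]
```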

Therefore, Randomized Invariance, plus the thesis that all the non-random reversing transformations leave the probabilistic situation unchanged (a thesis made plausible by the fact that, even with infinitesimal probabilities, we provably can have a model of the probabilities that is invariant under these transformations), shows that we must assign probability zero to all-tails, and hence that infinitesimal probabilities are mistaken.

This is, of course, a highly convoluted version of Timothy Williamson’s coin toss argument. The reason for the added complexity is to avoid any use of shift-based transformations that may be thought to beg the question against advocates of non-Archimedean probabilities. Instead, we simply use randomized reversal symmetry.

2 comments:

IanS said...

Non-conglomerability.

Defenders of infinitesimal probabilities must in any case reject conglomerability. So they will not be persuaded by this application of it.

To spell out the connection, sketch the argument like this: Fix a mirror position. Say P(all Tails to the right) = ε. Conditional on any specified sequence to the left, P(Palindrome) = P(all Tails to the right) = ε. Therefore unconditionally P(Palindrome) = ε.

On Randomized Invariance itself, I guess defenders of infinitesimal probabilities would say that if you want to randomize anything, you have to explicitly model the randomization.

Alexander R Pruss said...

You can look at Randomized Invariance as a special case of conglomerability. But note that even if one rejects conglomerability in general, one can accept special, particularly plausible cases of it. I think Randomized Invariance is one such special, particularly plausible case.