Wednesday, September 11, 2024

Independence conglomerability

Conglomerability says that if you have an event E and a partition {Ri : i ∈ I} of the probability space, then if P(E | Ri) ≥ λ for all i, we likewise have P(E) ≥ λ. Absence of conglomerability leads to a variety of paradoxes, but in various infinitary contexts, it is necessary to abandon conglomerability.
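
It is perhaps worth noting why this is distinctively an infinitary phenomenon (a standard observation, with the tacit assumption that each P(Ri) > 0 so the conditional probabilities are defined): for a countable partition and a countably additive probability, the law of total probability gives

P(E) = Σi P(E | Ri)·P(Ri) ≥ λ·Σi P(Ri) = λ,

so conglomerability holds automatically. Violations thus require an uncountable partition, a merely finitely additive probability, or some other infinitary ingredient.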

I want to consider a variant on conglomerability, which I will call independence conglomerability. Suppose we have a collection of events {Ei : i ∈ I}, and suppose that J is a randomly chosen member of I, with J independent of all the Ei taken together. Independence conglomerability requires that if P(Ei) ≥ λ for all i, then P(EJ) ≥ λ, where EJ is the event defined by: ω ∈ EJ if and only if ω ∈ Ei for i = J(ω), for each ω in our underlying probability space Ω.
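
In the finite case the principle is easy to check directly, since by independence P(EJ) is just the weighted average Σi P(J = i)·P(Ei), which cannot fall below the smallest P(Ei). Here is a minimal sketch of that computation; the particular space, events, and weights are made up for illustration and are not from anything above.

import random

random.seed(0)

# Hypothetical finite setting: 100 equally likely sample points and three events.
omega = list(range(100))
events = {
    1: {w for w in omega if w % 2 == 0},   # P = 0.5
    2: {w for w in omega if w < 60},       # P = 0.6
    3: {w for w in omega if w % 5 != 0},   # P = 0.8
}
p = {i: len(E) / len(omega) for i, E in events.items()}
lam = min(p.values())

# J is chosen independently of the sample point, with arbitrary weights.
weights = {1: 0.2, 2: 0.5, 3: 0.3}

# Exact value: by independence, P(EJ) = sum over i of P(J = i) * P(Ei).
p_EJ = sum(weights[i] * p[i] for i in events)
print(f"min P(Ei) = {lam:.3f}, P(EJ) = {p_EJ:.3f}")  # P(EJ) is at least lam

# Monte Carlo check: draw the sample point and J independently and test
# whether the point lies in the chosen event.
N = 100_000
hits = 0
for _ in range(N):
    w = random.choice(omega)
    j = random.choices(list(weights), weights=list(weights.values()))[0]
    hits += w in events[j]
print(f"simulated P(EJ) ≈ {hits / N:.3f}")

The interesting cases are therefore the infinitary ones, where no such averaging argument is available.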

Independence conglomerability follows from conglomerability if we suppose that P(EJ | J = i) = P(Ei) for all i.
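
To spell out the step (just unpacking the claim, assuming we can conditionalize on the events J = i): these events form a partition of the space, and on each cell P(EJ | J = i) = P(Ei) ≥ λ, so plain conglomerability applied to the partition {J = i : i ∈ I} yields P(EJ) ≥ λ.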

However, note that independence conglomerability differs from conglomerability in two ways. First, it can make sense to talk of independence conglomerability even in cases where one cannot meaningfully conditionalize on J = i (e.g., because P(J=i) = 0 and we don’t have a way of conditionalizing on zero probability events). Second, and this seems like it could be significant, independence conglomerability seems a little more intuitive. We have a bunch of events, each of which has probability at least λ. We independently randomly choose one of these events. We should expect the probability that our randomly chosen event happens to be at least λ.

Imagine that independence conglomerability fails. Then you can have the following scenario. For each i ∈ I there is a game available for you to play, where you win provided that Ei happens. You get to choose which game to play. Suppose that for each game, the probability of victory is at most λ. But, paradoxically, there is a random way to choose which game to play, independent of the events underlying all the games, where your probability of victory is strictly bigger than λ. (Here I reversed the inequalities defining independence conglomerability, by replacing events with their complements as needed.) Thus you can do better by randomly choosing which game to play than by choosing a specific game to play.
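
To unpack the complementation: if P(Ei) ≤ λ for all i, apply independence conglomerability to the complementary events Ω∖Ei, each of which has probability at least 1 − λ. The event that the randomly chosen complementary event happens is just Ω∖EJ, so we get P(Ω∖EJ) ≥ 1 − λ, i.e., P(EJ) ≤ λ. A random choice of game with winning probability strictly greater than λ is therefore exactly a failure of independence conglomerability.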

Example: I am going to uniformly randomly choose a positive integer (using a countably infinite fair lottery, assuming for the sake of argument that such a thing is possible). For each positive integer n, you have a game available to you: you win if n is greater than or equal to the number I am going to pick. You despair: there is no way for you to have any chance of winning, because whatever positive integer n you choose, I am infinitely more likely to get a number bigger than n than a number less than or equal to n, so your chance of winning is zero or infinitesimal regardless of which game you pick. But then you have a brilliant idea. If instead of choosing a specific number you independently and uniformly choose a positive integer n, the probability of your winning will be at least 1/2 by symmetry. Thus a situation with two independent countably infinite fair lotteries, together with a symmetry constraint that probabilities don’t change when you swap the lotteries with each other, violates independence conglomerability.
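
A genuinely countably infinite fair lottery cannot be simulated, but the symmetry point can be illustrated on a finite truncation: if both numbers are drawn uniformly and independently from {1, …, N}, then P(your number ≥ mine) = 1/2 + 1/(2N) ≥ 1/2. The sketch below just checks that; N and the trial count are arbitrary, and nothing here captures the infinite case itself.

import random

random.seed(1)
N = 1_000           # truncation level, standing in (imperfectly) for all positive integers
trials = 200_000

wins = 0
for _ in range(trials):
    mine = random.randint(1, N)    # my uniformly chosen number
    yours = random.randint(1, N)   # your independently, uniformly chosen number
    wins += yours >= mine          # you win if your number is at least mine

print(f"simulated P(win) ≈ {wins / trials:.3f}")  # theory: 1/2 + 1/(2N) = 0.5005

# By symmetry P(yours >= mine) = P(mine >= yours), and the two sum to 1 + P(tie),
# so each is at least 1/2 -- in contrast to picking any fixed n, which in the
# untruncated game wins with probability zero or infinitesimal.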

Is this violation somehow more problematic than the much-discussed violations of plain conglomerability that happen with countably infinite fair lotteries? I don’t know, but maybe it is. There is something particularly odd about the idea that you can noticeably increase your chance of winning by randomly choosing which game to play.

1 comment:

Alexander R Pruss said...

By the Sierpinski-Freiling argument, this conglomerability principle is incompatible with the Continuum Hypothesis. I'm somewhat skeptical of CH, though.