Wednesday, September 2, 2015

From a past-infinite causal sequence to a paradoxical lottery: A cosmological argument

Infinite fair lotteries are well known to be paradoxical. Let's say that an infinite fair lottery is played twice with tickets 1,2,3,.... Then whatever number wins first, you can be certain, or all but certain, that in the next run of the lottery a bigger number will win (since the probability of any particular number winning is zero or infinitesimal, so the probability that the winner is a member of the finite set of numbers smaller than or equal to the first picked number is zero or infinitesimal). So as you keep on playing, you can be completely confident that the next number picked will be bigger than the one you just picked. But intuitively that's not what's going to happen. Or consider this neat paradox. Given the infinite fair lottery, there is a way to change the lottery that makes each ticket infinitely more likely to win. Just run a lottery where the probability of ticket n is 2^(-n) (which is infinitely bigger than the zero or infinitesimal probability in the paradoxical lottery).

What makes the infinite fair lottery paradoxical is that

  1. there is a countable infinity of tickets
and
  2. each ticket has zero or infinitesimal chance of winning.
Let's stipulate that a lottery is "paradoxical" if and only if it satisfies (1) and (2).

Suppose now that a past-infinite causal sequence is possible (e.g., my being caused by my parents, their being caused by theirs, and so on ad infinitum). Then the following past-infinite causal sequence is surely possible as well. There is a machine that has always been on an infinite line with positions marked with integers: ...,-3,-2,-1,0,1,2,3,.... Each day, the machine has tossed a fair coin. If the coin was heads, it moved one position to the right on the line (e.g., from 2 to 3) and if it was tails, one position to the left (e.g., from 0 to -1). The machine moved in no other way.

We can think of today's position of the machine as picking out a ticket from a countably infinite lottery. Moreover, this countably infinite lottery is paradoxical. It satisfies (1) by stipulation. And it's not hard to argue that it satisfies (2), because of how random walks thin out probability distributions. (And all we need is finite additivity for the argument.)
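
To put a rough number on that thinning out, here is a minimal simulation sketch (illustrative only; it assumes the worst case for concentration, namely that all the weight sat on a single known position m days ago):

    # Illustrative sketch: start the machine at one known position m days ago and
    # simulate many runs of the fair-coin walk. The most common final position's
    # share of the runs shrinks roughly like 1/sqrt(m).
    import random
    from collections import Counter

    def final_position(m):
        # m days of fair coin tosses: heads = +1 (right), tails = -1 (left)
        return sum(random.choice((-1, 1)) for _ in range(m))

    runs = 5000
    for m in (10, 100, 1000):
        counts = Counter(final_position(m) for _ in range(runs))
        top_share = counts.most_common(1)[0][1] / runs
        print(m, top_share, round(top_share * m ** 0.5, 2))   # last column stays near 0.8

Any other way of spreading the weight m days ago is a mixture of shifted copies of this worst case, so no position's probability can exceed the shrinking maximum above.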

So if past-infinite causal sequences are possible, paradoxical lotteries are as well. But paradoxical lotteries are not possible, I say. So past-infinite causal sequences are not possible. So there is an uncaused cause.

25 comments:

  1. How do we verify that every number has a zero or infinitesimal probability? I can make sense of each number's having at each time some probability that the machine would reach it in n steps-- a finite probability falling away as the present distance to the machine grows. But I'm not sure that I can make sense of its having a timeless/eternal? probability that the machine would be there now.

  2. This comment has been removed by the author.

  3. I was thinking something like this. Suppose at time n (in days) the probability of being in position x is p_n(x). Then it's not hard to show that p_m(x) < C/sqrt(m-n) for any m>n, for some constant C (an absolute number; something independent of x, n, m and p_n). (The worst case scenario is where p_n has all of its weight at x when m-n is even, and at x-1 and/or x+1 when m-n is odd. Using that worst case scenario and a Gaussian estimate of the binomial distribution we get the indicated inequality. I haven't checked the details to be honest, but it should work.)

    Thus, p_m(x) < C/sqrt(m-n) for every n smaller than m; and since the past is infinite, m-n can be made as large as we like. But that can only be true if p_m(x) is zero or infinitesimal. Hence at all times all the probabilities are at most infinitesimal.
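
    As a rough numerical check of the worst-case estimate (an illustrative sketch, nothing more): with all the weight at one point at time n, the largest value of p_m is the central binomial term for k = m-n steps, and it hugs sqrt(2/(pi k)), so any constant C of 0.8 or so already works.

        # Largest single-position probability after k fair steps: the central binomial term.
        import math

        for k in (10, 100, 1000, 10000):
            worst = math.comb(k, k // 2) / 2 ** k           # the central term is the maximum
            print(k, worst, round(worst * math.sqrt(k), 4)) # last column -> sqrt(2/pi) ~ 0.7979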

    However, the argument above assumes that there always IS a probability of the machine being at x at time n. If there isn't, we get some sort of paradoxicality, too, I guess.

    BUT I can re-run the argument on the weaker assumption that our credences as to the positions are modeled by a family of probability distributions. The above argument then can be used to show that every probability distribution in the family has the property that it assigns at most infinitesimal probability to every position at every time.

    If the credences can neither be modeled by a probability distribution nor by a family of probability distributions, then we have something really quite paradoxical.

  4. Yes, I was thinking along the lines that we need not assign credences until we have more information.

    With the information as stated, we assign no credences. We adopt as a Bayesian prior a family of improper uniform distributions, but we don’t take them as credences. (So we avoid the paradoxes. You may think this is a fudge...) We use them only to calculate credences when taken together with other information. For example, if we are told that the machine is in position zero at time zero, we can calculate spreading binomial probabilities at positive times and similar spreading binomial credences at negative times, the more negative the wider.
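
    For instance (an illustrative sketch of the sort of calculation I mean, assuming we learn the machine is at position 0 at time 0): the credence for position x at time t is a centred binomial in |t| tosses, and by symmetry the same numbers serve forwards and backwards.

        # Credence for position x at time t, given position 0 at time 0: |t| fair tosses
        # separate the two times, so it is a centred binomial, the same for t and -t.
        import math

        def credence(x, t):
            k = abs(t)
            if (x + k) % 2 or abs(x) > k:        # wrong parity or out of range: unreachable
                return 0.0
            return math.comb(k, (x + k) // 2) / 2 ** k

        for t in (4, -4, 50, -50):
            print(t, [round(credence(x, t), 3) for x in range(-6, 7, 2)])  # flatter for larger |t|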

    In any case, the difficulty seems to come from the improper distribution rather than the past-infinite causation. Suppose for example that the machine is fixed for all time. There is no past-infinite causal sequence, but we would like to use an improper uniform distribution. Now suppose the machine is moving on a ring with a finite number of positions - heads moves right, tails moves left. There is a past-infinite causal sequence, but an unparadoxical finite uniform distribution.

  5. It's possible the argument could be made to work, but I'm not convinced yet. Two things which are issues for me:

    1. "Then the following past-infinite causal sequence is surely possible as well. There is a machine that has always been on an infinite line..." I am not convinced that it follows that this is surely possible. Physics has a way of preventing possibilities that cause paradoxes, even without preventing more generic possibilities that seem to allow for paradoxes. For example, you might think that instantaneous action at a distance implies the possibility of changing the past, given relativity, but quantum mechanics effectively allows instantaneous action at a distance (unless you accept a many worlds account) without permitting the effect of changing the past.

    2. Even without seeing particular flaws in it, I have a problem with arguments that involve infinite quantities. Otherwise one might use something like Zeno's argument to prove that motion is impossible. It may simply be that we cannot think well enough about infinity to prove anything with it, even if infinities can exist in reality.

  6. Ian:

    Suppose I say: "You give me a dollar if the machine is at zero and I'll give you ten if it's not." Wouldn't you take that?

    You might try using conglomerability intuitions here. Wherever the machine was a thousand steps ago, it's pretty unlikely it's at zero now.

    I agree that not every dynamics gives rise to paradox.

    Here's another fun case which doesn't give rise to paradox. The machine moves on the positions 0,1,2,3,... as follows: It has a probability of 0.5 of standing still, 0.3 of moving left and 0.2 of moving right, where moving left from position 0 counts as standing still. What should my credences be? Intuitively, we would expect them to be biased in favor of smaller numbers. In fact, it seems we want them to equal the steady-state probabilities that solve the equations: p(0) = 0.8 p(0) + 0.3 p(1) and p(n) = 0.5 p(n) + 0.2 p(n-1) + 0.3 p(n+1) for positive n. The solution is: 1/3, (1/3)(2/3), (1/3)(2/3)^2, ....
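
    A quick check, for what it's worth (illustrative code only), that this candidate solves the balance equations and normalizes to 1:

        # Check that p(n) = (1/3)*(2/3)**n satisfies the balance equations above.
        def p(n):
            return (1 / 3) * (2 / 3) ** n

        boundary_err = abs(p(0) - (0.8 * p(0) + 0.3 * p(1)))
        interior_err = max(abs(p(n) - (0.5 * p(n) + 0.2 * p(n - 1) + 0.3 * p(n + 1)))
                           for n in range(1, 50))
        print(boundary_err, interior_err, sum(p(n) for n in range(200)))  # ~0, ~0, ~1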

    Both this unparadoxical case and your unparadoxical case suggest that this sort of infinite causal process, if the process is possible, works as an effective shuffling (the relevant feature of an effective shuffling is that it makes the initial arrangement irrelevant). But in my paradoxical version, it shuffles us to a paradoxical probability. :-)

    Here's another way to argue that in my paradoxical case (or, rather, in a variant of it that I will describe shortly) there really is a paradoxical probability. Consider this family of backwards-infinite stochastic processes indexed by a parameter x that must be at least 1/4 (and at most 1/2). The machine moves on 0,1,2,3,.... When it's at zero it has probability 1/2+x of staying still and a probability of 1/2-x of moving right. When it's not at zero, it has probability 1/2 of staying still, probability x of moving left and probability 1/2-x of moving right.

    When x > 1/4, the probability of moving towards zero is always higher than the probability of moving away from zero. The equilibrium probability will be something like p(n) = a b^n, where b = (1/(2x))-1 and a is chosen to normalize correctly, with perhaps a tweak at the origin (haven't checked). For x > 1/4, b is less than 1, and so this is well-defined. The closer x is to 1/4, the closer b is to 1, and the smaller the normalization factor a must be to compensate--indeed the normalization factor a must go to zero as x approaches 1/4 (we could find the formula for a, but why bother?).
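
    Side check (again just illustrative, for a sample value of x): detailed balance suggests p(n) = a b^n with b = 1/(2x) - 1 and a = 1 - b, and numerically no extra tweak at the origin seems to be needed.

        # Check the guessed equilibrium for the family above, for a sample x in (1/4, 1/2).
        x = 0.35
        b = 1 / (2 * x) - 1
        a = 1 - b                                   # normalization, since the sum of a*b**n is a/(1-b)

        def p(n):
            return a * b ** n

        # At 0: stay with prob 1/2 + x, or arrive from 1 (which moves left with prob x).
        err_at_0 = abs(p(0) - ((0.5 + x) * p(0) + x * p(1)))
        # At n >= 1: stay 1/2, arrive from n-1 (right, prob 1/2 - x) or from n+1 (left, prob x).
        err_inside = max(abs(p(n) - (0.5 * p(n) + (0.5 - x) * p(n - 1) + x * p(n + 1)))
                         for n in range(1, 60))
        print(err_at_0, err_inside, sum(p(n) for n in range(400)))   # ~0, ~0, ~1

    And as x goes down to 1/4, b = 1/(2x) - 1 goes up to 1, so a = 1 - b goes down to 0, which is the point that matters next.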

    OK, so now the closer x is to 1/4, the lower the probability that the machine is at any particular location, since that probability can never exceed a. Intuitively, if x is at the critical value of 1/4, the probability of being at any particular location should thus be zero or at best infinitesimal.

    It would be really odd if for x = 0.25000000000000000000001 you were willing to make a bet on the machine not being at location 0 at some odds, but you weren't willing to make a bet at the same odds for x = 0.25, given that as x goes down, the process gets less and less biased towards zero. But any bet you would make on the machine not being at 0, you should be willing to make about its not being at any other location.

  7. Well, yes, I would accept your first bet. But I would be acting on intuition rather than credences in the sense of probability/decision theory. And I will worry about Dutch bookies if I actually meet one...

    Your original example, as you say, shuffles to a paradoxical improper uniform distribution. But indifference applied to a machine in an unknown fixed position gives the same distribution. I have no reason to reject a fixed machine as impossible, so I don’t reject the original setup. What is worrying me is that though the setup seems possible in your “physics-y” sense, I don’t see how the maths works.

    On your power law example, with x = 0.25, the setup seems possible but is not. Here is a variation that illustrates the problem in its simplest form. The machine lives on 0, 1, 2, ... It moves deterministically one step to the right at each time interval. It is impossible that this setup could be backwards infinite. (If the machine is at position n now, where was it n+1 time units ago?) Your example is more subtle, but the idea is the same.

  8. Ian:

    "But indifference applied to a machine in an unknown fixed position gives the same distribution."

    I think it's hard to come up with a case where indifference applies. We need to have some way of referring to positions. The obvious way is with numbers attached in some way. But then some numbers are special, namely those that are more simply describable. Compare: A priori, we should think it more likely that if the gravitational force satisfies the formula Gmm'/r^p, then p is 2 rather than 2+10^-100. (For if we don't think this a priori, we can't think it a posteriori, either, as all our data fits equally well with both hypotheses.) So we're not indifferent between the values of p.

    I am sceptical of probabilities that don't come from something like stochastic processes.

  9. A way to criticize my initial setup (and by extension my power-law setup) would be this. Imagine that God first foreknows all the results of the coin flips, and then directly brings it about at each time n < 0 that the machine is in such a position that (a) its movements will accord with the coin flips (e.g., if heads, move right; if tails, move left) and (b) the machine ends up at location 0 at time 0. (God can figure out where the machine should be at time n<0 by working backwards using his foreknowledge of the coin flips.) In this scenario we have everything I said in the post. But nonetheless it is fixed and certain that the machine will be at location 0 at time 0. In particular, given this setup, there is an unparadoxical probability for the machine's position at time 0: p(0)=1 and p(x)=0 if x isn't 0.

    (So the argument in my post cannot be sound. Note that an extension of the above example also shows that my power law story might not result in a power law.)

    In my foreknowledge story, the machine's position at time n isn't caused or explained by (a) the result of the flip at time n-1 and (b) the position of the machine at time n-1. Rather, the machine's position at time n is explained by God's plan for the position at time n+1 and the result of the flip at time n. So perhaps my post can be fixed by specifying that the position of the machine is caused and explained by the result of the flip at the previous time and the position of the machine then.

    That said, I now feel the force of Ian's last objection concerning the machine that always moves to the right but travels on 0,1,2,....

  10. That seems reasonable. I was thinking on similar lines. Note that if God forced position 0 at time 0, the distributions at other times would be exactly those calculated by the Bayesian approach in my first post.

    I’m not sure I understand your statement that the power law example might not result in a power law, at least for x greater than 0.25. God could force a particular outcome at a particular time, but at times much later (or earlier) the distribution would settle to the equilibrium.

    I was wrong to accept the original setup but reject the power law example with x = 0.25. They stand or fall together. Here is a nifty reflection argument (not original). To keep the explanation simple, think about a modified version in which the machine moves left with probability 0.5 or right with probability 0.5, except at position zero, where it moves right with probability 1. (This is like your power law setup with x = 0.25, but without the option to stay still.) This setup has exactly the same dynamics as the modulus of the position of the machine in your initial setup.

    My story about the machine that always moves to the right on 0,1,2... would apply if x were less than 0.25, but not in the neutral case x = 0.25.

  11. Actually, the conglomerability intuitions I referred to only require a finite partition, so they are legit.

  12. Let me clarify the last remark. Go back to my original story.

    Finite Conglomerability: Suppose it's certain that one of A_1,...,A_m happened (for finite m). Suppose that P(B|A_i) is less than or equal to p for all i. Then P(B) is less than or equal to p.

    (Countable Conglomerability would be a problematic assumption in contexts where infinite lotteries are happening, since Countable Conglomerability typically requires countable additivity. But Finite Conglomerability is very intuitive and doesn't seem to have similar problems.)

    Now, fix a negative time n. Let B be the event: machine is at location x at time 0. Consider the finite partition D,A(-|n|),...,A(0),...,A(|n|) where
    D = machine is more than |n| steps away from x at time n
    A(y) = machine is at x+y at time n
    Then P(B|D) = 0 and P(B|A(y)) < C/|n|^(1/2) for all y. By Finite Conglomerability P(B) cannot exceed C/|n|^(1/2). But since |n| can be made as large as we like, it follows that P(B) is less than every positive real number, and hence is zero or infinitesimal.
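
    To illustrate numerically (an illustrative sketch with |n| = 1000 and made-up prior weights over the cells of the partition): however the weight is spread over D and the A(y), mixing the conditionals keeps P(B) under the C/|n|^(1/2) cap.

        # Numerical illustration of the partition argument. The prior weights are arbitrary
        # stand-ins; the point is that any mixture of the conditionals obeys the cap.
        import math, random

        k = 1000                                     # |n|: days between time n and time 0

        def p_b_given_a(y):
            # P(machine at x at time 0 | machine at x+y at time n): a binomial in k steps
            if (y + k) % 2 or abs(y) > k:
                return 0.0
            return math.comb(k, (y + k) // 2) / 2 ** k

        weights = {y: random.random() for y in range(-k, k + 1)}   # arbitrary weights on the A(y)
        d_weight = random.random()                                 # arbitrary weight on D; P(B|D) = 0
        total = d_weight + sum(weights.values())
        p_b = sum(w / total * p_b_given_a(y) for y, w in weights.items())

        cap = math.sqrt(2 / (math.pi * k))
        print(p_b, max(p_b_given_a(y) for y in range(-k, k + 1)), cap)  # P(B) <= max <= cap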

  13. Dr Pruss:
    Apologies for the late response. This has been vaguely niggling me for the last few days. You said four posts up that maybe the setup could be fixed but that you were concerned by my objection (in effect) that in some setups past-infinite paths may not be possible.

    You were right to be concerned. Here is a tempting fix that does not work. As in the original setup, the machine moves on ... -1, 0, 1, ... Suppose that there are an infinite number of independent coin tosses, one for each integer time-position combination. Wherever the machine is, its next step is determined by the outcome of the toss at its current time and position – heads one step right, tails one step left.

    This fixes one problem. Even knowing all the coin toss outcomes, God could not in general force the machine to be in a particular position at a particular time. (Because some time/positions will be inaccessible. If the toss at time -1, posn -1 is tails and the toss at time -1, posn +1 is heads, then the machine cannot be at posn 0 at time 0.)

    The catch is that with probability 1, there are no past-infinite paths. Sketch proof (if I’m thinking straight - which is always doubtful): Think about a target position at a target time. At any earlier time, the set of points from which the target can be reached is a single interval (up to the usual odd/even issue). The left and right boundaries of the interval change with time as independent random walks. So the width also follows a random walk. With probability 1, a random walk starting anywhere crosses zero. When this happens, there is no path to the target point from any earlier time. This applies (with probability 1) to any target point at time zero, hence (with probability 1) to the (countable) union.
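
    Here is a little simulation in the spirit of the sketch (my own illustrative code, with an arbitrary cut-off on how far back it looks): draw the coin at each time/position pair as needed, track the set of positions from which the target is still reachable, and record how far back that set survives.

        # Coins at every (time, position) pair, tracked backwards from the target
        # (time 0, position 0). Returns how many steps back the set of positions that
        # can still reach the target survives (None = still alive at the cut-off).
        import random

        def steps_until_cut_off(max_back=5000):
            reachable = {0}                    # positions at the current time that can reach the target
            for back in range(1, max_back + 1):
                new = set()
                for pos in range(min(reachable) - 1, max(reachable) + 2):
                    toss = random.choice('HT')                 # the independent coin at this time/position
                    dest = pos + 1 if toss == 'H' else pos - 1
                    if dest in reachable:
                        new.add(pos)
                if not new:
                    return back                # nothing at this earlier time leads to the target
                reachable = new
            return None

        print([steps_until_cut_off() for _ in range(20)])      # typically all finite, and mostly small

    Of course a simulation only suggests the "with probability 1" claim; it doesn't establish it.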

  14. Ian:

    You write: "At any earlier time, the set of points from which the target can be reached is a single interval (up to the usual odd/even issue). The left and right boundaries of the interval change with time as independent random walks." It's not clear to me why. I would think that (modulo the even/odd issue), the interval from which the target can be reached simply monotonically extends as one goes backwards.

    Now, granted, the probability of reaching a particular target point at a particular target time is zero. So with probability one, one will miss that point. That's true. But of course the probabilities here aren't going to be countably additive.

  15. To be clear, all this is given a particular set of coin toss outcomes. It was not obvious to me until I set up a simulation on an Excel spreadsheet :-). Think about what I wrote about some points being inaccessible, and extend it back.

    Replies
    1. But given a particular infinite sequence of coin flips one can work backwards and there is only one place at any past time from which the target point is accessible, no?

  16. To be more explicit, suppose the target can be reached from W consecutive odd numbered points at time T. Then at time T-1 the target can be reached from all the “middle” even numbered points (W-1 of them), the even point to the left (with probability 1/2) and the even point to the right (independently with probability 1/2). This will make W-1, W, W+1 consecutive even numbered points with probabilities 1/4, 1/2, 1/4. So the number of points from which the target can be reached follows a random walk without drift.

  17. By the way, some of the math becomes easier if the machine is moving on a continuous line, and instead of coin flips there are independent N(0,1) Gaussians being picked, which at each time step are added to the position of the machine. And the lottery is run by looking for the integer n such that the machine is in [n,n+1). Then we don't have to worry about the even/odd issue, independent Gaussians are easier to add than binomials, and every point is accessible from every point.

    On the other hand, my finite additivity argument no longer works, precisely because every point is accessible from every point.
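
    For what it's worth, a quick numerical look at the continuous version (with the start taken as 0 purely for illustration): after k steps the position is N(start, k), so the chance of landing in any given unit interval is at most the peak density 1/sqrt(2 pi k), which again goes to zero like 1/sqrt(k) even though nothing is strictly inaccessible.

        # After k independent N(0,1) increments the position is N(start, k). Monte Carlo
        # estimate of landing in [0, 1), against the peak-density bound 1/sqrt(2*pi*k).
        import math, random

        samples = 100000
        for k in (10, 100, 1000):
            hits = sum(0 <= random.gauss(0, math.sqrt(k)) < 1 for _ in range(samples))
            print(k, hits / samples, 1 / math.sqrt(2 * math.pi * k))   # estimate vs. bound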

  18. No, there is not (in general) only one possible backwards path from the target point. Suppose the target is time 0, posn 0. Then if the coin toss at time -1, posn -1 is heads and the toss at time -1, posn +1 is tails, there will be 2 backward paths to time -1. Remember, in my model there is an infinite array of coin tosses, only some of which the machine uses.

    It took me a while to start thinking backwards. That’s why I’m not sure I’m thinking straight. :-)

    I will have to think about the continuous version.

  19. But I wouldn't model it with a coin toss at every spatial location. Not sure how to model it better.

  20. Good luck! I mean this sincerely. I will continue to think about it, and I look forward to reading what you come up with.

  21. This comment has been removed by the author.

  22. I am inclined to think, by the way, that results like yours also put a damper on the idea of an infinite causal past. For if stuff has been happening for an infinite amount of time, we would expect there to have been random processes of all sorts happening. And it would be strange if such processes were possible but had to involve compact state spaces or have drifts that rendered them unproblematic. One needs to be cautious, of course, because it's unsurprising that (as per one of your comments) one can't have a past-infinite system whose temperature rises by one degree every day (there not being negative temperatures), though one can have a past-infinite system whose temperature falls by one degree every day.

    By the way, would you be willing to give me your full name (by email, say) so I can cite your comments and/or help properly in the book I'm writing? I am really grateful for your discussion (on this and other posts).

  23. I have sent an email with the title "Who is IanS?" to the address on your home page.
