Tuesday, February 28, 2017

An unimpressive fine-tuning argument

One of the forces of nature that physicists don’t talk about is the flexi force, whose value between two particles of masses m₁ and m₂ at distance r apart is radial and given by F = km₁m₂r. If k were too positive, the universe would fall apart, and if k were too negative, the universe would collapse. There is a sweet spot of life-permissivity where k is very close to zero. And, in fact, as far as we know, k is exactly zero. :-)

Indeed, there are infinitely many forces like this, all of which have a “narrow” life-permitting range around zero and a force constant that, as far as we know, is zero. But somehow this fine-tuning does not impress as much as the more standard examples of fine-tuning. Why not?

Probably it’s this: For any force, we have a high prior probability, independent of theism, that its strength is zero. This is part of our epistemic preference for simpler theories. Similarly, if k is a constant in the laws of nature expressed in a natural unit system, we have a relatively high prior probability that k is exactly 1 or exactly 2. (Thought experiment: you measure k in the lab to six decimal places and get 2.000000; you will now think that it’s probably exactly 2; but if you had uniform priors, your posterior probability that it is exactly 2 would be zero.)
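
A minimal numerical sketch of the thought experiment (my own illustration, not anything in the post): give k a “spike-and-slab” prior, with some point mass on k being exactly 2 and the rest spread uniformly over an interval, and model the lab measurement as Gaussian noise around the true value. The 0.5 spike weight, the Uniform(0, 4) slab and the 5e-7 measurement error are all assumptions chosen for illustration.

```python
import math

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def posterior_exactly_2(obs, sigma=5e-7, spike=0.5, lo=0.0, hi=4.0):
    """P(k = 2 exactly | obs) under a spike-and-slab prior (assumed numbers)."""
    # Likelihood if k really is exactly 2.
    like_spike = normal_pdf(obs, 2.0, sigma)
    # Marginal likelihood under the slab: with sigma tiny relative to the
    # interval, integrating Uniform(lo, hi) against the Gaussian gives ~1/(hi - lo).
    like_slab = 1.0 / (hi - lo)
    num = spike * like_spike
    return num / (num + (1.0 - spike) * like_slab)

print(posterior_exactly_2(2.000000))  # ~0.9999997: "it's probably exactly 2"
# With a purely continuous uniform prior there is no spike term, and the
# posterior probability of the exact value 2 is 0 for any observation.
```

With the spike in place, one clean measurement of 2.000000 makes “exactly 2” overwhelmingly probable; drop the spike and the posterior probability of the exact value is zero no matter what is observed.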

But this in turn leads to a different explanatory question: Why is it the case that we ought (as surely we ought, pace subjective Bayesianism) to have such a preference, and such oddly non-uniform priors?

5 comments:

  1. You explained it yourself: the epistemic preference for simpler theories. If you are asking why we should prefer simpler theories, the reason is that reality prefers them. And if you are asking why reality prefers them, it is because it is mathematically necessary for simpler theories, on average, to have a higher probability. (This is related to the demonstrated impossibility of picking an integer with uniform probability over all integers; on average lower integers have to be more probable.)
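
A standard way to make the parenthetical precise (my gloss, not anything in the comment):

```latex
\[
\text{If } P(n) = c \text{ for every } n \in \mathbb{N}, \text{ then }
\sum_{n=1}^{\infty} P(n) =
\begin{cases}
0 & \text{if } c = 0,\\
\infty & \text{if } c > 0,
\end{cases}
\qquad \text{never } 1,
\]
\[
\text{and for any proper } P \text{ and any } \varepsilon > 0, \qquad
\#\{\, n : P(n) \ge \varepsilon \,\} \le \tfrac{1}{\varepsilon}.
\]
```

So no uniform assignment over all the integers sums to 1, and only finitely many integers can receive probability at or above any fixed threshold; in that sense any proper distribution has to favour “earlier” integers on the whole.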

  2. *If* there are probabilities over the integers at all, then, yes, higher ones must on average be less probable. But such things are irrelevant in practice, because they say nothing about how things must go for the first 10^10000 integers. You can have on balance a preference for smaller numbers, and yet prefer numbers of the order of magnitude of 10^100 very strongly.
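
A toy prior that does what the comment describes (my own construction, with N = 10^100 chosen to match the example): a two-sided geometric distribution centred at N,

```latex
\[
P(n) = \frac{1}{3}\left(\frac{1}{2}\right)^{\lvert n - N \rvert},
\qquad n \in \mathbb{Z}, \quad N = 10^{100}.
\]
```

This sums to 1 (since 1 + 2(1/2 + 1/4 + …) = 3), puts more than 99.9% of its mass within ten integers of 10^100, and yet, as any proper distribution must, assigns vanishing probability to integers far enough out in either direction.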

  3. I’m not convinced that either we or nature have a preference for integer constants. (Check out the story of Dirac, QED and the electron spin g-factor, measured as 2.002319304361. Another example: relative isotopic masses are very nearly, but not exactly, integers.) Where there are simple numerical ratios, they usually arise from combining identical units. Think of molar ratios in chemistry, or Miller indices in crystallography.

    We do (and should, I think) tend to stick with our existing ideas until we have good reasons to change them. (Think Kuhnian paradigm shift, or religious conversion). That’s why we take forces that “physicists don’t talk about” as zero (strictly, non-existent) until we get strong evidence to the contrary. I doubt that this can be fitted into a Bayesian framework.

    “Classical” statisticians represent this status quo bias by the so-called “null hypothesis”. Bayesians doubt that this makes sense. There are endless debates on this in the statistical literature.

    Replies
    1. Surely there is a point where we'd snap to exactly 2. Likewise, if the digits started spelling out (in a natural encoding) the Book of Genesis in Hebrew, after a while (ten characters?) we'd come to expect that the next digit will also match Genesis.
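
A back-of-the-envelope version of the Genesis example (my numbers: the 22-letter alphabet, the prior of 1e-13 and the one-character-per-digit encoding are all assumptions): each further matching character multiplies the odds of the “message” hypothesis by roughly the alphabet size, so even a tiny prior is overwhelmed after a dozen or so characters.

```python
def posterior_message(matches, prior=1e-13, alphabet=22):
    """P(the digits deliberately encode Genesis | `matches` characters match).

    Under mere chance each character matches with probability 1/alphabet;
    under the message hypothesis it matches with probability ~1, so the
    Bayes factor after `matches` characters is alphabet**matches.
    """
    odds = (prior / (1.0 - prior)) * alphabet ** matches
    return odds / (1.0 + odds)

for m in (5, 10, 15):
    print(m, posterior_message(m))
# With these assumed numbers: 5 matches leave the hypothesis negligible,
# 10 matches push it to roughly 0.7, and 15 make it a near-certainty,
# close in spirit to the "ten characters?" guess, though the exact count
# depends entirely on the prior you start from.
```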

  4. As your Genesis example shows, we are drawn to a plausible theory (“God is trying to tell us something”) rather than to particular numbers. Of course, it may be easier to dream up theories for integers than for other numbers.
