Let G_p be the law of gravitation that states that F = Gm_1m_2/r^p, for some real number p. There was a time when it was rational to believe G_2. But here is a problem. When 0 < |p − 2| < 10^-100 (say), G_p is practically empirically indistinguishable from G_2, in the sense that within the accuracy of our instruments it predicts exactly the same observations. Moreover, there are uncountably many values of p such that 0 < |p − 2| < 10^-100. This means that the prior probability for most (i.e., all but at most countably many) such values of p must have been 0. On the other hand, if the prior probability for G_2 had been 0, then the posterior probability would have always stayed at 0 in our Bayesian updates (because the probability of our measurements conditionally on the denial of G_2 never was 0, which it would have to have been to budge us from a zero prior).
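To spell out the Bayesian point (a standard calculation; here E stands for any course of evidence whose probability conditional on the denial of G_2 is non-zero, so that P(E) > 0):

  P(G_2 | E) = P(E | G_2)·P(G_2)/P(E) = P(E | G_2)·0/P(E) = 0.

No such evidence could ever have raised the posterior above zero.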
So, G_2 is exceptional in the sense that it has a non-zero prior probability, whereas most hypotheses G_p have zero prior probability. This embodies a radical preference for a more elegant theory.
Let N be the set of values of p such that the rational prior probability P(G_p) is non-zero. Then N contains at most countably many values of p. I conjecture that N is the set of all the real numbers that can be specifically defined in the language of mathematics (e.g., 2, 3.8, e^π and the smallest real root of z^7 + 3z^6 + 2z^5 + 7πz^3 − z + 18).
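The countability of N, spelled out: the hypotheses G_p are mutually exclusive, so for each positive integer n at most n values of p can have P(G_p) > 1/n, or else the probabilities would sum past 1. Hence

  N = {p : P(G_p) > 0} = ∪_{n ≥ 1} {p : P(G_p) > 1/n}

is a countable union of finite sets and so is countable. The conjectured N is countable for an independent reason: there are only countably many finite defining formulas.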
If this is right, then Bayesian regularity—the thesis that all contingent hypotheses should have non-zero probability—should be replaced by the weaker thesis that all contingent expressible hypotheses should have non-zero probability.
Note that all this doesn’t mean that we are a priori certain that the law of gravitation involves a mathematically definable exponent. We might well assign a non-zero probability to the disjunction of G_p over all non-definable p. We might even assign a moderately large non-zero probability to this disjunction.
9 comments:
If we go only on the empirical measurements of G (see phys.org/news/2015-04-gravitational-constant-vary.html), we see that the exponent of 2 in r-squared may vary much more than you have stated: its possible value, insofar as it is bound up with the measurement of G, varies by more than 1 part in 10^4 according to that article.
But since the squared exponent is a necessary consequence of the physical model of force in 3D space, our confidence that the exponent is exactly 2 is really our Bayesian confidence that the accepted model of force in physics is true. That is why the article assumes exactly 2 in r-squared and so attributes the observed variation to our measurement of G.
Dr Pruss, are "the laws of physics" a "thing" or "things"? I spoke with a philosophically adept theist who denied this. If they aren't things then what are they? If they are only concepts, then do they not still refer to an objectively existing thing/s? If they aren't things in themselves, then would they not at least be nested in a thing/s?
Also, would the laws of physics not be concrete? That is, having causal power?
I always thought they were contingent concrete things. I am very confused now.
William:
I have no idea what the actual bounds on the exponent are. The measurements of G are not the only relevant data for the exponent. I assume that if the exponent is other than 2, then there will be a further deviation from ellipticity of orbits in two-body cases, and presumably that could reveal itself in models.
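Here is a minimal numerical sketch of that deviation point (entirely illustrative: natural units with GM = 1, made-up initial conditions, and a crude integrator; the function name perihelion_angles is mine). For p = 2 the simulated orbit closes and the perihelion angle stays put, while for p slightly above 2 the perihelion visibly precesses:

    # Minimal sketch (nothing here is from the post or the linked articles):
    # integrate a planar two-body orbit under an attraction F ~ 1/r**p, in
    # natural units with G*M = 1, and record the angle at each perihelion
    # (closest approach). For p = 2 the orbit closes and the angle stays put
    # (up to integration error); for p != 2 the perihelion precesses.

    import math

    def perihelion_angles(p, steps=300000, dt=1e-4):
        mu = 1.0                 # G*M in natural units (assumed)
        x, y = 1.0, 0.0          # start on the x-axis...
        vx, vy = 0.0, 1.1        # ...moving tangentially, a bit above circular
                                 # speed, so the start is a perihelion
        angles = []
        r2 = r1 = None           # radii at the two previous steps
        x1, y1 = x, y            # position at the previous step
        for _ in range(steps):
            r = math.hypot(x, y)
            # a local minimum of r at the previous step marks a perihelion
            if r2 is not None and r1 < r2 and r1 < r:
                angles.append(math.atan2(y1, x1))
            # semi-implicit Euler step; acceleration is -mu * r_vec / r**(p+1)
            a = -mu / r**(p + 1)
            vx += a * x * dt
            vy += a * y * dt
            x1, y1 = x, y
            x += vx * dt
            y += vy * dt
            r2, r1 = r1, r
        return angles

    for p in (2.0, 2.1):
        ang = perihelion_angles(p)
        drift = ang[-1] - ang[0] if len(ang) > 1 else float("nan")
        print(f"p = {p}: perihelion angle drift over the run ~ {drift:.4f} rad")

By Bertrand's theorem, among power-law attractions only the inverse-square law (and the linear, harmonic-oscillator law) yields closed orbits for all bound initial conditions, so any p other than 2 produces some precession.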
I agree that an exponent of 2 is intuitively most natural when we think of the influence as spreading out over a sphere. But as far as I know, nothing logically requires it. It is logically possible that the attractive force would even get stronger as one got further away.
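(To spell the intuition out: the surface area of a sphere of radius r is 4πr^2, so a conserved influence spread evenly over that sphere is diluted in proportion to 1/r^2; that is where the exponent 2 comes from on this picture. But the dilution model itself is a contingent assumption, not a logical truth.)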
Nelson:
I think they are constituted by the patterns of arrangement of the Aristotelian forms. http://alexanderpruss.com/papers/Forms.html
There is a relevant discussion here: https://physics.stackexchange.com/questions/22010/the-distance-square-in-the-newtons-law-of-universal-gravitation-is-really-a-squ
From the articles linked there, the actual bounds on the exponent are much better than 10^-4. The 1894 article by Hall proposed a change in exponent of the order of 10^-7 to account for the precession of Mercury. If Hall's calculations are right, then since that precession was measurable, it follows that a variation of the order of 10^-7 would have been observed.
Serious problem with my proposal: One cannot define definability in ZFC on pain of a contradiction.
Useful reference: https://arxiv.org/pdf/1105.4597.pdf
The problem noted looks fatal now. For it may well be that all reals are definable, absurd as that seems.
Thank you very much for your assistance, Dr Pruss. I'm finding the paper interesting so far. It makes sense to me because I am quite unconvinced by the Humean view.
Maybe it's not fatal. Assuming an inaccessible cardinal, there is a model of set theory in which there is a countable set of definable reals.
A second problem is the possibility of countable definable sets of reals all of whose members are non-definable. If we assign zero measure to each real in such a set, by countable additivity we need to assign zero measure to the whole set, but the whole set is definable so it should have non-zero measure.
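In symbols: if S is such a set, countable additivity forces

  P(p ∈ S) = Σ_{p ∈ S} P(G_p) = Σ_{p ∈ S} 0 = 0,

yet "p ∈ S" is an expressible contingent hypothesis, so on the weakened regularity thesis above it should have received non-zero probability.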