Tuesday, November 12, 2019

The Incarnation and timelessness

Consider the standard argument against the Incarnation:

  1. Everything that is God is F (omnipotent, omniscient, impassible, etc.).

  2. Everything that is human is non-F.

  3. Christ is God and human.

  4. So, Christ is F and non-F.

  5. Contradiction!

But it is only a contradiction to be F and non-F at the same time: we’ve known this since Aristotle.
Thus the kenotic theologian gets out of the argument by holding that Christ was F prior to the Incarnation and wasn’t F after the Incarnation. (A difficult question for the kenoticist: is he now F?) But that’s contrary to the teaching of the Councils.

However, the “at the same time” observation does not need to lead to kenoticism. In fact, the Christian who is a classical theist should deny that Christ is F and non-F at the same time. For it is strictly false to say that Christ is F at t for any divine attribute F and any time t, since God has the divine attributes timelessly rather than at a time.

This is not kenoticism. Rather, the view is that Christ is F timelessly eternally and non-F at t (for any t after the beginning of the Incarnation). Kenoticism on this view is metaphysically absurd, because God cannot cease to be F: one can only cease to be something that one used to be, and there is no “used to be” where there is no temporality.

But we sometimes say things like:

  • While he was suffering on the cross, Christ was upholding the existence of the universe.

I think there are two ways of making sense of such statements. First, maybe, things that happen timelessly count honorifically as holding at all times. (Compare David Lewis’s idea that abstract objects count as existing in all his worlds.) Second, the statement can be understood as follows:

  • While he was suffering on the cross, the following proposition was true: Christ is upholding the existence of the universe.

So, orthodox Christians do not actually need to talk of natures to get out of (1)-(5). Of course, if we want to allow—as I think we should—for the logical possibility of multiple simultaneous incarnations, then the temporal qualification way out won’t help. (Nor will the kenotic solution help in that case, either.)

Note, by the way, that once we realize that there can be timelessly eternal existence, we need to modify Aristotle’s temporal qualification to the law of non-contradiction:

  • it is impossible to be F and non-F in the same respect at the same time or both eternally.

More complications for Dutch Book results

Think of a wager as a sequence of event-payoff pairs:

  • W = ((e1, u1),...,(en, un)).

There are then two different ways to calculate the expected value of the wager. First, directly:

  1. ED(W)=u1P(e1)+...+unP(en).

Second, indirectly by letting UW be the utility function defined by W, i.e., UW = u1 ⋅ 1e1 + ... + un ⋅ 1en (where 1e is the function that is 1 if e happens and 0 otherwise) and then calculating the expected utility of the function UW:

  2. EI(W)=E(UW).

If the credence function P is additive, then the two ways are equivalent. But without additivity, they come apart. Moreover, there is more than one way of calculating E(U) if the credences are inconsistent, but for now I will assume the standard Lebesgue sum way where, assuming U has only finitely many values, E(U) = ∑_y y P(U = y).

The most common de Finetti Dutch Book Theorem, which says that inconsistent probabilities give rise to a Dutch Book, makes use of the direct way of calculating the values of wagers. Specifically, it considers wagers where you pay an amount x for a chance to win amount y if event E eventuates, and it calculates the value of such a wager as yP(E)−x. However, if instead one uses the indirect method of calculation, the value of such a wager becomes (y − x)P(E)−xP(Ec), where Ec is the complement of E.
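
For concreteness, here is a minimal Python sketch of the two calculations (not from the post; the encoding of events as sets of outcomes, and of a possibly non-additive credence as a dictionary on events, is just one convenient choice):

# Events are frozensets of outcomes; a credence P maps events to numbers and
# need not be additive. A wager is a list of (event, payoff) pairs.

def expected_direct(wager, P):
    # E_D(W) = u1*P(e1) + ... + un*P(en)
    return sum(u * P[e] for e, u in wager)

def expected_indirect(wager, P, outcomes):
    # E_I(W) = E(U_W), computed as a Lebesgue sum: sum over y of y * P(U_W = y),
    # where U_W(w) adds the payoffs of all events containing the outcome w.
    U = {w: sum(u for e, u in wager if w in e) for w in outcomes}
    return sum(y * P[frozenset(w for w in outcomes if U[w] == y)]
               for y in set(U.values()))

# For the pay-x-to-win-y wager W = ((Omega, -x), (E, y)), expected_direct gives
# y*P(E) - x*P(Omega), while expected_indirect gives (y - x)*P(E) - x*P(E^c)
# (for a nontrivial E); these come apart when P is not additive.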

This actually makes a real difference to Dutch Book theorems. Consider this inconsistent credence for a coin toss:

  • P(H)=1/4

  • P(T)=1/4

  • P(H&T)=0

  • P(H ∨ T)=1.

Then for any utility function U, it turns out that EI(U)>0 if and only if the expected value of U is positive given the standard consistent fair-toss measure. The reason is this. Either U has the same value at heads and tails or it does not. If it has the same value at heads and tails, then EI(U) has the same value as the expectation using the fair measure, since P agrees with the fair measure regarding H ∨ T. On the other hand, if U has different values at heads and tails, then EI(U)=(1/4)U(H)+(1/4)U(T), which is exactly half of the fair measure’s expectation for U, and hence, again, EI(U)>0 if and only if the fair measure says the expectation is positive. It seems to follow that EI recommends exactly the same wagers as the standard consistent fair-toss measure.
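
Here is a quick sanity check of that claim in Python (illustrative only; the particular utility functions tried are arbitrary):

P_inc = {frozenset({"H"}): 0.25, frozenset({"T"}): 0.25, frozenset({"H", "T"}): 1.0}
P_fair = {frozenset({"H"}): 0.5, frozenset({"T"}): 0.5, frozenset({"H", "T"}): 1.0}

def lebesgue_expectation(U, P):
    # E(U) = sum over y of y * P(U = y); U maps outcomes to utilities.
    return sum(y * P[frozenset(w for w in U if U[w] == y)] for y in set(U.values()))

for U in [{"H": 3, "T": 3}, {"H": 4, "T": -2}, {"H": -1, "T": -5}]:
    e_inc = lebesgue_expectation(U, P_inc)
    e_fair = lebesgue_expectation(U, P_fair)
    print(U, e_inc, e_fair, (e_inc > 0) == (e_fair > 0))  # the signs always agree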

Except that this isn’t quite true, either. For in addition to two ways of calculating expected values, there are two ways of making decisions on their basis in the case where a sequence of wagers is offered:

  3. Accept a wager whose individual expected utility is positive.

  4. Accept a wager when the expected utility of the already-accepted wagers combined with the currently offered wager exceeds the expected value of the combination of the already-accepted wagers.

Here, the combination of two wagers is concatenation. For instance, ((e1, u1),(e2, u2)) combines with ((e3, u3)) to form the wager ((e1, u1),(e2, u2),(e3, u3)). Given consistent credences, we have E(W1 + W2)=E(W1)+E(W2), and (3) and (4) are equivalent. But, again, for inconsistent credences this additivity property can fail, and so a choice needs to be made between (3) and (4).
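
Here is a small illustration of that failure, again using the inconsistent coin credences from above and the indirect (Lebesgue sum) valuation:

P = {frozenset({"H"}): 0.25, frozenset({"T"}): 0.25, frozenset({"H", "T"}): 1.0}
OUTCOMES = ["H", "T"]

def expected_indirect(wager, P):
    # Lebesgue-sum expectation of the utility function defined by the wager.
    U = {w: sum(u for e, u in wager if w in e) for w in OUTCOMES}
    return sum(y * P[frozenset(w for w in OUTCOMES if U[w] == y)]
               for y in set(U.values()))

W1 = [(frozenset({"H"}), 1.0)]  # win 1 if heads
W2 = [(frozenset({"T"}), 1.0)]  # win 1 if tails

print(expected_indirect(W1, P), expected_indirect(W2, P))  # 0.25 0.25
print(expected_indirect(W1 + W2, P))  # 1.0, not 0.25 + 0.25 = 0.5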

Note that (4) is itself an oversimplification. For theoretically, what wagers one accepts earlier on may depend on one’s best estimate as to what wagers will be offered later.

All in all, I know of five utility maximization decision procedures for sequences of wagers, generated by the answers to these questions:

  • Direct or indirect utility calculation for a wager? (D or I)

  • If indirect, Lebesgue sum or level set integral for calculating expectations? (LSum or LSet)

  • If indirect, is the presently offered wager combined with previously accepted wagers in calculating expectations? (Indiv or Combo)

For consistent probabilities, these are all equivalent.

Moreover, there are two kinds of Dutch Books. There are Simple Dutch Books, where from the original position the agent accepts a Dutch Book, and Incremental Dutch Books, where after accepting some wagers, the agent goes on to accept a Dutch Book.

What happens with Dutch Books varies between the different procedures, and I am still working out the details. Say that a credence P is monotonic provided that P(∅)=0, P(Ω)=1 and P(A)≤P(B) whenever A ⊆ B. Here is what I have:

  • D: Simple Dutch Books whenever probabilities are inconsistent.

  • I+LSum+Indiv: I conjecture Incremental Dutch Books for some but not all inconsistent monotonic credences.

  • I+LSum+Combo: I conjecture Incremental Dutch Books for all non-additive credences.

  • I+LSet+Indiv: I don’t know.

  • I+LSet+Combo: No Dutch Books of either sort for any monotonic credences.

Sunday, November 10, 2019

The intellect is not higher than the will


  1. The perversion of the higher faculty is worse, other things being equal.
  2. Moral wrongdoing is worse than error, other things being equal.
  3. Moral wrongdoing is the perversion of the will.
  4. Error is the perversion of the intellect.
  5. So, the intellect is not higher than the will.

Thursday, November 7, 2019

Expected utility and inconsistent credences

Suppose that we have a utility function U and an inconsistent credence function P, and for simplicity let’s suppose that our utility function takes on only finitely many values. The standard way of calculating the expected utility of U with respect to P is to look at all the values U can take, multiply each by the credence that it takes that value, and add:

  1. E(U) = ∑_y y P(U = y).

Call this the Block Way or Lebesgue Sums.

Famously, doing this leads to Dutch Books if the credence function fails additivity. But there is another way to calculate the expected utility:

  2. E(U) = ∫_0^∞ P(U > y) dy − ∫_{−∞}^0 P(U < y) dy.

Call this the Level Set Way, because sets of points in a space where some function like U is bigger or smaller than some value are known as level sets.

Here is a picture of the two ways:

[Figure: Blocks vs. Level Sets]

On the Block Way, we broke up the sample space into chunks where the utility function is constant and calculated the contribution of each chunk using the inconsistent credence function, and then added. On the Level Set Way, we broke it up into narrow strips, and calculated the contribution of each strip, and then added.
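
Here is a minimal Python sketch of the two Ways for a utility function with finitely many values (illustrative; it assumes the credence P assigns a value to every event it is asked about):

def block_way(U, P):
    # E(U) = sum over y of y * P(U = y): one chunk per value of U.
    return sum(y * P[frozenset(w for w in U if U[w] == y)] for y in set(U.values()))

def level_set_way(U, P):
    # E(U) = integral from 0 to infinity of P(U > y) dy
    #      - integral from -infinity to 0 of P(U < y) dy,
    # evaluated exactly, strip by strip, between consecutive values of U (and 0).
    pts = sorted(set(U.values()) | {0.0})
    total = 0.0
    for a, b in zip(pts, pts[1:]):
        if a >= 0:  # strip over [a, b) with height P(U > a)
            total += (b - a) * P[frozenset(w for w in U if U[w] > a)]
        else:       # strip over (a, b] with height P(U < b)
            total -= (b - a) * P[frozenset(w for w in U if U[w] < b)]
    return total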

It turns out that if the credence function P is at least monotone, so that P(A)≤P(B) if A ⊆ B, a condition strictly weaker than additivity, then an agent who maximizes utilities calculated the Level Set Way will not be Dutch Booked.

Here is another fact about the Level Set Way. Suppose two utility functions U1 and U2 are certain to be close to each other: |U1 − U2|≤ϵ everywhere. Then on the Block Way, their expected utilities may be quite far apart, even assuming monotonicity. On the other hand, on the Level Set Way, their expected utilities are guaranteed to be within ϵ, too. The difference between the two Ways can be quite radical. Suppose a coin is tossed, and the monotone inconsistent credences are:

  • heads: 0.01

  • tails: 0.01

  • heads-or-tails: 1

  • neither: 0

Suppose that U1 says that you are paid a constant $100 no matter what happens. Both the Block Way and the Level Set Way agree that the expected utility is $100.
But now suppose that U2 says you get paid $99 on heads and $101 on tails. Then the Block Way yields:

  • E(U2)=0.01 ⋅ 99 + 0.01 ⋅ 101 = 2

while the Level Set Way yields:

  • E(U2)=1 ⋅ 99 + 0.01 ⋅ 2 = 99.02

Thus, the Block Way makes the expected value of U2 ridiculously small, and far from that of U1, while the Level Set Way, though still wrong—after all, the credences are stupid—is much closer.
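
For what it's worth, the two numbers can be checked directly in a couple of lines of Python (mirroring the chunks and strips described above):

P = {frozenset({"H"}): 0.01, frozenset({"T"}): 0.01, frozenset({"H", "T"}): 1.0}

# Block Way for U2 ($99 on heads, $101 on tails): one chunk per value.
block_U2 = 99 * P[frozenset({"H"})] + 101 * P[frozenset({"T"})]

# Level Set Way for U2: height P(U2 > y) = 1 for 0 <= y < 99,
# and height P(U2 > y) = P(tails) for 99 <= y < 101.
level_U2 = (99 - 0) * P[frozenset({"H", "T"})] + (101 - 99) * P[frozenset({"T"})]

print(round(block_U2, 2), round(level_U2, 2))  # 2.0 99.02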

So, it makes sense to think of the Level Set Way as harm reduction for those agents whose credences are inconsistent but still monotone.

That said, many irrational agents will fail monotonicity.

Wednesday, November 6, 2019

Presentism and the Cross

  1. It is important for Christian life that one unite one’s daily sacrifices with Christ’s sufferings on the cross.

  2. Uniting one’s sufferings with something non-existent is not important for Christian life.

  3. So, Christ’s sufferings on the cross are a part of reality.

  4. So, presentism is false.

Monday, November 4, 2019

Velocity and teleportation

Suppose a rock is flying through the air northward, and God miraculously and instantaneously teleports the rock, without changing any of its intrinsic properties other than perhaps position, one meter to the west. Will the rock continue flying northward due to inertia?

If velocity is defined as the rate of change of position, then no. For the rate of change of position is now westward and the magnitude is one meter divided by zero seconds, i.e., infinite. So we cannot expect inertia to propel the rock northward any more. In fact, at this point physics would break down, since the motion of an object with infinite velocity cannot be predicted.

But if velocity (or perhaps momentum) is an intrinsic feature that is logically independent of position, and it is merely a law of physics that the rate of change of position equals the velocity, then even after the miraculous teleportation, the rock will have a northward velocity, and hence by inertia will continue moving northward.

I find the second option to be the more intuitive one. Here is an argument for it. In the ordinary course of physics, the causal impact of physical events at times prior to t1 on physical events after t1 is fully mediated by the physical state of things at t1. Hence whether an object moves after time t1 must depend on its state at t1, and only indirectly on its state prior to t1. But if velocity is the rate of change of position, then whether an object moves via inertia after t1 would depend on the position of the object prior to t1 as well as at t1. So velocity is not the rate of change of position, but rather a quality that it makes sense to attribute to an object just in virtue of how it is at one time.

This would have the very interesting consequence that it is logically possible for an object to have non-zero velocity while not moving: God could just constantly prevent it from moving without changing its velocity.

Friday, November 1, 2019

Guessing and omniscience

Suppose that yesterday you guessed that today I’d freely mow the lawn, and today I did freely mow the lawn. Then, the correctness of your guess is a doxastic good you possessed.

(Note: If the future is open, so that there was no truth yesterday that today I’d mow the lawn, it’s a little tricky to say when you possessed it. For when you guessed, it wasn’t true that you possessed the doxastic good of guessing correctly. Rather, it has now become the case that this doxastic good is attributable to you.)

Now no one can have a doxastic good that God lacks. Thus, God had to have at least guessed the same thing yesterday. And God has no doxastic bads. So, God never gets anything wrong. But the only plausible way it can be true that

  1. God always gets right the things we guess right, and

  2. God never gets things wrong

is if God has comprehensive knowledge of the future.

Thursday, October 31, 2019

The local five minute hypothesis, the Big Bang and creation

The local five minute hypothesis is that the earth, with everything on it, and the environment five light-minutes out from it, came into existence five minutes ago.

Let’s estimate the probability of getting something like a local five minute hypothesis by placing particles at random in the observable universe. Of course, in a continuous spacetime the probability of getting exactly the arrangement we have is zero or infinitesimal. But we only need to get things right to within a margin of error of a Planck distance for all practical purposes.

The volume of the observable universe is about 10^80 cubic meters. The Planck volume is about 10^−105 cubic meters. So, getting a single particle at random to within a Planck volume of where it actually is has a probability of about 10^−185.

But, if we’re doing our back-of-envelope calculation in a non-quantum setting (i.e., with no uncertainty principle), we also need to set the velocity for the particles. Let’s make our margin of error be the equivalent of moving a Planck distance within ten minutes. So our margin of error for velocity in any direction will be about 10^−35 meters in 600 seconds, or about 10^−38 meters per second. Speeds range from 0 to the speed of light, or about 10^8 meters per second, so the probability of getting each of the three components of the velocity right is about 10^−46, and since there are three components, getting all of them right has a probability of something like 10^−138. The probability of getting both the position and velocity of a particle right is then 10^−(185 + 138) = 10^−323. Yeah, that’s small. Also, there are about 100 different types of particles, and there are a few other determinables like spin, so let’s multiply that by about 10^−3 to get 10^−326.

The total mass of planetary stuff within around five light minutes of earth—namely, Earth, Mars and Venus—is around 10^25 kilograms. There are no more than about 10^25 atoms, and hence about 10^27 particles, per kilogram. So, we have 10^52 particles we need to arrange within our volume.

We’re ready to finish the calculation. The probability of arranging these many particles with the right types and within our position and velocity margins of error is:

  • (10^−326)^(10^52) ≈ 10^(−10^2.5 × 10^52) ≈ 10^(−10^55).

Notice, interestingly, that most of the 55 comes from the number of particles we are dealing with. In fact, our calculations show that basically getting 10^N particles in the right configuration has, very roughly, a probability of around 10^(−10^(N+3)).
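
As a sanity check, the exponent bookkeeping can be redone in a few lines of Python (a rough sketch; every input is just one of the order-of-magnitude estimates used above):

import math

log_p_position = -105 - 80      # Planck volume 10^-105 m^3 over 10^80 m^3 of universe
log_p_velocity = 3 * (-38 - 8)  # three components, margin 10^-38 m/s out of ~10^8 m/s
log_p_particle = log_p_position + log_p_velocity - 3  # particle types, spin, etc.
log_n_particles = 25 + 27       # ~10^25 kg of planets times ~10^27 particles/kg

# (10^-326)^(10^52) = 10^(-326 * 10^52), i.e. roughly 10^(-10^55).
exponent_of_exponent = math.log10(-log_p_particle) + log_n_particles

print(log_p_position, log_p_velocity, log_p_particle)  # -185 -138 -326
print(round(exponent_of_exponent, 1))                  # 54.5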

So what? Well, Roger Penrose has estimated the probability of a universe with an initial entropy like ours at 10^(−10^123). So, now we have two hypotheses:

  • A universe like ours came into existence with a Big Bang

  • The localized five minute hypothesis.

If there is no intelligence behind the universe, and if probabilistic calculations are at all appropriate for things coming into existence ex nihilo, the above probability calculations seem about right, and the localized five minute hypothesis wins by a vast margin: 10^(−10^55) to 10^(−10^123) or, roughly, 10^(10^123) to 1. And if probabilistic calculations are not appropriate, then we cannot compare the hypotheses probabilistically, and lots of scepticism also follows. Hence, if there is no intelligence behind the universe, scepticism about everything more than five minutes ago and more than five light minutes from us follows.

Wednesday, October 30, 2019

1+1=3 or 2+2=4

On numerical-sameness-without-identity views, two entities that share their matter count as one when we are counting objects.

Here is a curious consequence. Suppose I have a statue of Plato made of bronze with the nose broken off and lost. I make up a batch of playdough, sculpt a nose out of it and stick it on. The statue of Plato survives the restoration, and a new thing has been added, a nose. But now notice that I have three things, counting by sameness:

  • The statue of Plato

  • The lump of bronze

  • The lump of playdough.

Yet I only added one thing, the lump of playdough or the nose that is numerically the same (without being identical) as it. So, it seems, 1+1=3.

Now, it is perfectly normal to have cases where by adding one thing to another I create an extra thing. Thus, I could have a lump of bronze and a lump of playdough and they could come together to form a statue, with neither lump being a statue on its own. A new entity can be created by the conjoining of old entities. But that’s not what happens in the case of the statue of Plato. I haven’t created a new entity. The statue was already there at the outset. And I added one thing.

Maybe, though, what should be said is this: I did create a new thing, a lump of bronze-and-playdough. This thing didn’t exist before. It is now numerically the same as the statue of Plato, which isn’t new, but it is still itself a new thing. I am sceptical, however, whether the lump of bronze-and-playdough deserves a place in our ontology. We have unification qua statue, but qua lump it’s a mere heap.

Suppose we do allow, however, that I created a lump of bronze-and-playdough. Then we get another strange consequence. After the restoration, counting by sameness:

  • There are two things that I created: the nose and the lump of bronze-and-playdough

  • There are two things that I didn’t create: the statue of Plato and the lump of bronze.

But there are only three things. Which makes it sound like 2+2=3. That’s perhaps not quite fair, but it does seem strange.

Tuesday, October 29, 2019

Sameness without identity

Mike Rea’s numerical-sameness-without-identity solution to the problem of material constitution holds that the statue and the lump have numerical sameness but do not have identity. Rea explicitly says that numerical sameness implies sharing of all parts but not identity.

Does Rea here mean: sharing of all parts, proper or improper? It had better not be so. For improper parthood is transitive.

Proposition. If improper parthood is transitive and x and y share all their parts (proper and improper), then x = y.

Proof: Suppose that x and y share all their parts. Then since x is a part of x, x is a part of y, and since y is a part of y, y is a part of x. Moreover, if x ≠ y, then x is a proper part of y and y is a proper part of x. Hence by transitivity, x would be a proper part of x, which is absurd, so we cannot have x ≠ y. □

So let’s assume charitably that Rea means the sharing of all proper parts. This is perhaps coherent, but it doesn’t allow Rea to preserve common sense in Tibbles/Tib cases. Suppose Tibbles the cat loses everything below the neck and becomes reduced to a head in a life support unit. Call the head “Head”. Then Head is a proper part of Tibbles. The two are not identical: the modal properties of heads and cats are different. (Cats can have normal tails; heads can’t.) This is precisely the kind of case where Rea’s sameness without identity mechanism should apply, so that Head and Tibbles are numerically the same without identity. But Tibbles has Head as a proper part and Head does not have Head as a proper part. But that means Tibbles and Head do not share all their proper parts.

Here may be what Rea should say: if x and y are numerically the same, then any part of the one is numerically the same as a part of the other. This does, however, have the cost that the sharing-of-parts condition now cannot be understood by someone who doesn’t already understand sameness without identity.

Friday, October 25, 2019

The present king of Ruritania

Suppose I am a quack and I announce:

  1. These green pills cured the king of Ruritania of lung cancer.

I am lying, of course. The green pills never cured anyone of lung cancer.

But wait. To lie, I have to assert. To assert, there has to be a proposition that is being expressed. But (1) doesn’t express any proposition, because “Ruritania” is a non-referring name.

Maybe, then, (1) is not a lie, but something that is wrong for the same reason that a lie is wrong. For instance, on Jorge Garcia’s account, lying is wrong as it’s a betrayal of the trust solicited by the very same act. If so, then my pretend assertion of (1) might be wrong for exactly the same reason as a lie.

The point can also be made without relying on non-referring proper names. Suppose Jones has lied, cheated, stolen, plagiarized and defenestrated his friends, but reporting doesn’t make his character black enough for my purposes. So I say:

  2. Dr. Jones has lied, cheated, stolen, plagiarized, defenestrated his enemies, and garobulated his friends.

This doesn’t express a proposition. But it’s just as bad as a lie.

Thursday, October 24, 2019

Perdurance and particles

A perdurantist who believes that particles are fundamental will typically think that the truly fundamental physical entities are instantaneous particle-slices.

But particles are not spatially localized, unless we interpret quantum mechanics in a Bohmian way. They are fuzzily spread over space. So particle-slices have the weird property that they are precisely temporally located—by definition of a slice—but spatially fuzzily spread out. Of course, it is not too surprising if fundamental reality is strange, but maybe the strangeness here should make one suspicious.

There is a second problem. According to special relativity, there are infinitely many spacelike hyperplanes through spacetime at a given point z of spacetime, corresponding to the infinitely many inertial frames of reference. If particles are spatially localized, this isn’t a problem: all of these hyperplanes slice a particle that is located at z into the same slice-at-z. But if the particles are spatially fuzzy, we have different slices corresponding to different hyperplanes. Any one family of slices seems sufficient to ground the properties of the full particle, but there are many families, so we have grounding overdetermination of a sort that seems to be evidence against the hypothesis that the slices are fundamental. (Compare Schaffer’s tiling requirement on the fundamental objects.)

A perdurantist who thinks the fundamental physical entities are fields has a similar problem.

A supersubstantialist perdurantist, who thinks that the fundamental entities are points of spacetime, doesn’t run into this problem. But that’s a really, really radical view.

An “Aristotelian” perdurantist who thinks that particles (or macroscopic entities) are ontologically prior to their slices also doesn’t have this problem.

Wednesday, October 23, 2019

Book in Progress: Norms, Natures and God

I have begun work on a book with the working title Norms, Natures and God. It should be a book on how positing Aristotelian natures solves problems in ethics (normative and meta), epistemology, semantics, metaphysics and mind, but also on how, especially after Darwin, to be an intellectually satisfied Aristotelian one must be a theist. The central ideas for this were in my Wilde Lectures.

There is a github repository for the project with a PDF that will slowly grow (as of this post, it only has a table of contents) as I write. I welcome comments: the best way to submit them is to click on "Issues" and just open a bug report. :-)

The repository will disappear once the text is ready for submission to a publisher.

Perdurance and slices

One of the main problems with perdurance is thought to be that it makes intrinsic properties be primarily properties of slices, and only derivatively of the four-dimensional whole.

The most worrisome case of this problem has to do with mental properties. For if our slices have the mental properties primarily, and we only have them derivatively, then that leads to a sceptical problem (how do I know I am a whole and not a slice?) and besides violates the intuition that we have our mental properties primarily.

But someone who accepts a perdurantist ontology and accepts the idea that we are four-dimensional wholes does not have to say that intrinsic properties are primarily had by slices. For a property that involves a relation to one’s parts can still be intrinsic (having one’s parts is surely intrinsic!). Now instead of saying that, say, Bob has temporary property P at time t in virtue of his slice Bt at t having P, we can say that Bob has P in relation to Bt. This is very similar to how relationalist endurantists say that we have our temporary properties in relation to times, except that times are normally thought of as extrinsic to the object, while the slices are parts of the objects.

In fact, this helps save some intuitions of intrinsicness. For instance, it seems to be an intrinsic property of me that my heart is beating. But if t is now and At is my slice now, then At does not seem to intrinsically have the property of heart-beat. It seems that heart-beat is a dynamical property dependent not just on the state of the object at one time but also at nearby times. Thus, if we want to attribute heart-beat to At primarily, then heart-beat will not be intrinsic, as it will depend on At as well as slices At′ for t′ near t. But if we see my present heart-beat as a property of the four-dimensional worm, a property the worm has in relation to At (as well as neighboring times), then heart-beat can be an intrinsic property—and it can be had primarily by me, not my slices.

It is plausible that mental properties are dynamical as well: that one cannot tell just from the intrinsic properties of a three-dimensional slice whether thought is happening. (This is pretty much certain given materialism, but I think is plausible even on dualism.) So, again, mental properties aren’t going to be intrinsic properties of slices. But they can be primarily the intrinsic properties of four-dimensional persons, had in relation to their slices.

Tuesday, October 22, 2019

Persistence and internal times

Here are some desiderata for a view of the persistence of objects:

  1. Ordinary objects can change with respect to intrinsic properties.

  2. Ordinary objects are the primary bearers of some of the changeable intrinsic properties.

  3. Ordinary objects are literally present at multiple times.

Endurantism is usually allied with some sort of view on which temporary properties are had in relation to times, and hence the temporary properties are relational and not intrinsic. Perdurantism violates 2: it is the stages, not the ordinary objects, that are the primary bearers of the temporary intrinsics. And no primary bearer of a property can change with respect to it. Exdurantism violates 3: ordinary objects only exist at a single time.

Here is a view that yields all three desiderata. Objects have internal times, and these internal times are literally parts of the objects. Changeable intrinsic properties are relational to the internal times: an object is, say, straight at internal time t1 and bent at internal time t2.

Let’s go through the desiderata. The internal times are parts of the object, and a property obtaining in virtue of relations between one’s own parts can still be intrinsic. Shape, for instance, might be had in virtue of the spatial relationships between the parts of an object—and yet this does not rule out shape being intrinsic (indeed, for David Lewis it’s paradigmatically intrinsic). Similarly, consciousness properties in a split brain might be had relationally to a brain hemisphere, but are still intrinsic since brain hemispheres are parts of the patient. Thus we can have (1).

Moreover, while parts—namely, internal times—are used to account for change, the parts are not the primary bearers of the changeable intrinsic properties. The changeable intrinsic properties are relational, holding between the ordinary object and the internal times, but that does nothing to rule out the possibility that some of these properties are primarily had by the object as a whole.

Ordinary objects can be literally present at multiple times. One can ensure this either in an endurantist way, so that the ordinary objects are multiply temporally located 3D objects, or in a four-dimensionalist way, so that the ordinary objects are 4D. Note that the endurantist version may require the ordinary object to have parts—namely, the internal times—that do not themselves endure but that only exist for an external instant. But there is no problem with an enduring object having a short-lived part.

There is another variant of the view. The internal times could be taken to be abstract objects instead of parts of the ordinary object. Arguably, a property that is had in virtue of a relation to an abstract object is not thereby objectionably extrinsic. If it were, then strong Platonists would all count as denying the existence of intrinsic properties.