Consider this case, which a colleague tells me is standard. You are bleeding badly, and you need to get to the hospital. You get in your car; no ambulance is available. However, unbeknownst to you, your car's ignition is wired to a bomb. What should you, prudentially, do? Suppose you say: "Don't go to the hospital; try to self-treat." Why would you say that? Well, self-treating has better consequences than turning on the ignition. Call somebody who says this a "consequence externalist".
But what does it mean to say that self-treating has better consequences than turning on the ignition? I suppose it's because something like this pair of conditionals is true:
- (1) Were you to turn on the ignition, the bomb would explode and you'd die immediately.
- (2) Were you not to turn on the ignition, you'd live longer.
If something like generalized standard Molinism (i.e., Molinism generalized to indeterministic stuff other than free will) is true, (1) and (2) are perfectly well defined. But suppose no such view is true. Then all we really have at the time of the decision are objective probabilities: it is overwhelmingly likely, given the physical state of the world, that if you turn on the ignition, the bomb will explode and you'll die immediately, and so on. So, it seems, the consequence externalist must be deeming the conditionals true when the relevant objective probabilities are high enough.
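The probability-threshold reading can be put schematically. The notation and the threshold $\varepsilon$ are my own glosses, not part of the original proposal:

```latex
% A \mathbin{\Box\!\!\rightarrow} C reads: "were A to hold, C would hold"
% S_t is the physical state of the world at the decision time t
% \varepsilon is some small threshold (an assumption of this sketch)
A \mathbin{\Box\!\!\rightarrow} C \text{ is true at } t
\quad\iff\quad
P(C \mid A,\, S_t) \;\ge\; 1 - \varepsilon .
```

On this reading, (1) and (2) both come out true in the story, since both probabilities are overwhelming at the time of the decision.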
So, it seems, the consequence externalist is saying that you ought not to turn on the ignition because it is exceedingly likely, given the actual arrangement of the universe at the time of the action, that refraining will let you live longer, and exceedingly likely that turning on the ignition will not.
Fine. Now imagine that you in fact turn on the ignition, the electrons quantum-tunnel around the bomb, and all is well (maybe eventually the bomb quantum-tunnels into the sun, too). This is exceedingly unlikely, but it is compatible with everything in the story so far. According to the consequence externalist position I've sketched, you in fact did the wrong thing, even though it had better consequences than the alternative. You did the wrong thing because, at the time of the decision, the objective probabilities were against it.
But to say that in this case you did the wrong thing goes against the guiding intuitions of the consequence externalist. Once you admit that you might have done the wrong thing even though it had the better consequences, you should probably just abandon consequence externalism altogether and move from objective to subjective probabilities.
Now, there is something the consequence externalist can say. She can say that we evaluate subjunctives by objective probabilities when their antecedents are false, and by the actual truth values of their consequents when the antecedents are true. This is messy, but not crazy. So, in the case I've described, (1) is false because it has a true antecedent and a false consequent, while (2) is true because it has a false antecedent and the objective probability of its consequent given its antecedent was high at the time of the action.
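The mixed semantics can be stated as a two-case rule. Again the formalization is mine, reusing the threshold $\varepsilon$ as an assumption:

```latex
% Mixed semantics: actual outcomes settle conditionals with true
% antecedents; objective probabilities settle the rest.
A \mathbin{\Box\!\!\rightarrow} C \text{ is true at } t
\quad\iff\quad
\begin{cases}
C \text{ is actually true}, & \text{if } A \text{ is true},\\[2pt]
P(C \mid A,\, S_t) \ge 1 - \varepsilon, & \text{if } A \text{ is false}.
\end{cases}
```

Applied to the story: (1) falls under the first case (you did turn on the ignition, and you did not die), so it is false; (2) falls under the second case (its antecedent is false, and the probability of living longer given no ignition was high), so it is true.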
But if the consequence externalist says this, she has something weird to say. She has to say that (a) turning on the ignition was in fact right, but (b) had you not turned on the ignition, turning on the ignition would have been wrong. Why does she have to say (b)? Because if you had not turned on the ignition, the subjunctive conditional (1) would have been true: it would then have had a false antecedent, and hence would have been evaluated by the objective probabilities, which overwhelmingly favored its consequent.
So, oddly, you did the right thing, but had you not done it, it would have been the wrong thing. That is weird indeed.