Tuesday, December 6, 2022

Dividing up reasons

One might think that reasons for action are exhaustively and exclusively divided into the moral and the prudential. Here is a problem with this. Suppose that you have a spinner divided into red and green areas. If you spin it and it lands on red, something nice happens to you; if it lands on green, something nice happens to a deserving stranger. You clearly have reason to spin the spinner. But, assuming the division of reasons, your reason for spinning it is neither moral nor prudential.

So what should we say? One possibility is to say that there are reasons of only one type, say the moral. I find that attractive. Then benefits to yourself also give you moral reason to act, and so you simply have a moral reason to spin the spinner. Another possibility is to say that in addition to moral and prudential reasons there is some third class of “mixed” or “combination” reasons.

Objection: The chance p of the spinner’s landing on red gives you a prudential reason, and the chance 1 − p of its landing on green gives you a moral reason. So you have two reasons, one moral and one prudential.
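
To put the objection a bit more formally (just a sketch; u_you and u_stranger are placeholder values for how good the respective nice outcomes are, and the proportionality model is only an illustrative assumption):

strength of the prudential reason ∝ p · u_you
strength of the moral reason ∝ (1 − p) · u_stranger

On this picture, the reason to spin the spinner is exhausted by these two weighted components.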

Response: That may be right in the simple case. But now imagine that the “red” set is a saturated nonmeasurable subset of the spinner edge, and so is the “green” set. A saturated nonmeasurable set has no reasonable probability assignment, not even a non-trivial range of probabilities such as from 1/3 to 1/2 (at best we can assign it the full range from 0 to 1). Now, the reason-giving strength of a chancy outcome is proportional to its probability. But in the saturated nonmeasurable case there is no probability, and hence no meaningful strength for the red-based reason or for the green-based reason. There is, however, a meaningful strength for the red-or-green moral-cum-prudential reason. Hence the red-or-green-based reason does not reduce to two separate reasons, one moral and one prudential.
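
In the terms of the sketch above (still only a sketch, writing P for the probability measure on the spinner edge and R and G for the red and green sets): when R and G are saturated nonmeasurable, P(R) and P(G) are undefined, so neither p · u_you nor (1 − p) · u_stranger has a value. Yet P(R ∪ G) = 1, since the two sets jointly cover the edge, so the combined red-or-green reason has the well-defined strength of a guaranteed nice outcome for someone.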

Now, one might have technical worries about saturated nonmeasurable sets figuring in decisions. I do. (E.g., see the Axiom of Choice chapter in my infinity book.) But instead of supposing saturated nonmeasurable sets, suppose a case where an agent subjectively has literally no idea whether some event E will happen: no probability assignment for E whatsoever, not even a ranged one (except for the full range from 0 to 1). The spinner’s landing on a set believed to be saturated nonmeasurable might be an example of such a case, but the case could be more humdrum; it is just a case of extreme agnosticism. Now suppose that the agent is told that if they so opt, they will get something nice on E and a deserving stranger will get something nice otherwise. As before, the agent clearly has reason to opt, and that reason has a meaningful strength, but it does not decompose into a prudential reason and a moral reason each with a meaningful strength of its own.

Final remark: The argument applies to any exclusive and exhaustive division of reasons into “simple” (i.e., non-combination) types.
