Tuesday, April 23, 2024

Value and aptness for moral concern

In two recent posts (this and this) I argued that dignity does not arise from value.

I think the general point here goes beyond value. Some entities are more apt for being morally concerned about than others. These entities are more appropriate beneficiaries of our actions, we have more reason to protect them, and so on. The degreed property these entities have more of has no name, but I will call it “apmoc”: aptness for moral concern. Dignity is then a particularly exalted version of apmoc.

Apmoc as such is agent-relative. If you and I have cats, then my cat has more apmoc relative to me than your cat, while your cat has more apmoc relative to you. Thus, I should have more moral concern for my cat and you for yours. Agent-relativity can be responsible for the bulk of the apmoc in the case of some entities—though probably not in the case of entities whose apmoc rises to the level of dignity.

However, we can distinguish an agent-independent core to an entity’s apmoc, which I will call the entity’s “core apmoc”. One can think of the core apmoc as the apmoc the entity has relative to an agent who has no special relationship to the entity. (Note: My concern in this post is the apmoc relative to human agents, so the core apmoc may still be relative to the human species.)

Now, then, here is a thesis that initially sounds good, but I think is quite mistaken:

  1. An entity’s core apmoc is proportional to its value.

For suppose I have two pet dragons, on par with respect to all properties, except one can naturally fly and the other is naturally flightless. The flying dragon has more value: it is a snazzier kind of being, having an additional causal power. Both dragons equally like being scratched under the chin (perhaps with a rake). The fact that the flying dragon has more value does not give me any additional reason to scratch it. More generally, the flying dragon does not have any more core apmoc.

One might object: if it is a matter of saving the life of one of the dragons, other things being equal, one should save the life of the flying dragon, because it is a better kind of being. However, even if this judgment is correct, it is not due to a difference in apmoc. If the flying dragon dies, more value is lost. The death of a dragon removes from the world all the goods of the dragon: its majestic beauty, its contribution to winter heating, its protection of the owner, its prevention of sheep overpopulation, and so on. The death of the flying dragon removes a good from the world (an instance of the causal power of flight) that the death of the flightless dragon does not. If the reason one should save the life of the flying dragon over the flightless one is that the flying one is a better kind of being, then one should save its life not because the flying dragon has more apmoc, but because more is lost by its death. If I have a choice of saving Alice from losing a thumb or Bob from losing his little toe, I should save Alice, not because Alice has more apmoc, but because a thumb is a bigger loss than a toe.

The above objection highlights one important point. Sometimes bestowing what is in some sense “the same benefit” on an entity will in fact bestow a benefit proportional to the value of the entity. Saving an entity from destruction sounds like “the same benefit”, but it is a greater benefit where there is more value to be saved. Similarly, if I have a choice between fixing a tire puncture on my car or on my bike, more value is gained when I fix the car’s tire, because the car is more valuable. However, this is not because the car has more apmoc, but simply because the benefits are different: if I fix the car’s tire, the car becomes capable of transporting my whole family around, while fixing the bike’s tire only makes it capable of transporting me.

Let’s move away from fantasy. Suppose Alice and Bob are on par in all respects, except that Alice knows the 789th digit of π while Bob does not. Knowledge is valuable, and so if you have more knowledge, you have more value. But if I have a choice of whom to give a delicious chocolate-chip muffin, the fact that Alice knows the 789th digit of π is irrelevant: it contributes (slightly) to her value but not at all to her core apmoc (it might contribute to the agent-relative aspects of apmoc in some special cases, since shared knowledge can be a partial constituent of a morally relevant relationship).

Granted, a piece of knowledge is only a contingent contribution to value. One might retreat to the thesis that core apmoc is proportional to an entity’s essential values. But I think this is implausible. Most people have the intuition that, other things being equal, a virtuous person has more apmoc than a vicious one. Yet virtue is not an essential value: it is a value that fluctuates over a lifetime.

The case of virtue and vice suggests that there may be some values that contribute to core apmoc. I think this is likely. Core apmoc does not appear in a vacuum. But the connection between apmoc and value is complex, and the two are quite different.

3 comments:

SMatthewStolte said...

Why assume that apmoc is a degreed property rather than saying that, in various contexts, there is an ordering of priorities that ought to be had? It might turn out that you ought to prioritize saving the life of the flying dragon over the flightless dragon, but this does not imply (so far as I can tell) that the apmoc properties grounding this duty are quantifiable. In other words, I don’t see any reason to assume that there is some value R such that the apmoc of the flying dragon is equal to R times the apmoc of the flightless dragon.

Alexander R Pruss said...

By "degreed", I didn't mean that it was quantifiable. I just meant there is a more and a less. Artistic excellence is degreed, but these degrees are not quantifiable. I don't think we disagree.

SMatthewStolte said...

Yeah, it looks like we agree. I was just reading too much into the word, possibly as a result of unfamiliarity with the literature on the subject.