Alice and Bob are both bad people, and both believe in magic. Bob believes that he lives in an infinite universe, with infinitely many sentient beings. Alice thinks the only life there is is life on Earth. They each perform a spell intended to cause severe pain to all sentient beings other than themselves.
There is a sense in which Bob does something infinitely worse than Alice: he tries to cause severe pain to infinitely many beings, while Alice is only trying to harm finitely many beings.
It is hard to judge Bob as an infinitely worse person than Alice, because we presume that if Alice thought that there were infinitely many sentient beings, she would have done as Bob did.
But even if we do not judge Bob as an infinitely worse person, shouldn’t we judge his action as infinitely worse? Yet even that doesn’t seem right. And neither seems to deserve that much more punishment than a sadistic dictator who tries to infect “mere millions” with a painful disease.
Could it be that punishment maxes out at some point?
4 comments:
At least when it comes to punishment of human beings, yes, and for the same reason that a billion only seems to the human mind a little bigger than a million, even though it is much, much bigger. The punishment of a human being needs to be the sort of thing that the human mind, by its nature, can make sense of at some sort of intuitive level.
Presumably, with training (say, in mathematics), a human mind could get an intuitive sense of the difference between a million and a billion. Does that mean that the better mathematically trained mass-murderer should get a way bigger punishment?
This post sounds a bit too much like the challenge posed by my high school chum when told by a jailer to keep quiet in his cell: "What are you gonna do? Double-lock me up?"
Maybe something kind of like that, though I’d have to think about it.
There seems to be some limit to our natural cognitive abilities when it comes to understanding the moral weight of our crimes. Call W the maximum moral weight we could, by nature, intuitively understand. There’s some punishment that is proportional to W, and I think that is some sort of maximum punishment for us.
One problem, though, is that lots of individuals max out at some value much less than W, and the reason they max out might wind up mattering. If they are maxing out because they have developed a poor character that makes it harder for them to understand the weight of their crimes, then maybe their punishment shouldn't be less on that account.
Sometimes, one of the ways that an understanding of one’s own guilt winds up maxing out is by turning into despair, and I’m not at all sure what is going on there. Maybe that is another way of avoiding understanding the weight of one’s crimes.
Concerning trained mathematicians, I don’t think the kind of intuitions that they develop when learning how to manipulate the symbols that represent big numbers is likely to track the kinds of intuitions that would be needed in order to understand the weight of guilt.