Suppose soft determinism is true: the world is deterministic and yet we are responsible for our actions.
Now imagine a device that can be activated at a time when an agent is about to make a decision. The device reads the agent’s mind, figures out which action the agent is determined to choose, and then modifies the agent’s mind so the agent doesn’t make any decision but is instead compelled to perform the very action that they would otherwise have chosen. Call the device the Forcer.
Suppose you are about to make a difficult choice between posting a slanderous anonymous accusation about an enemy of yours that will go viral and ruin his life, and not posting it. It is known that once the message is posted, there will be no way to undo the bad effects. Neither you nor I know how you will choose. I now activate the Forcer on you, and it makes you post the slander. Your enemy’s life is ruined. But you are not responsible for ruining it, because you didn’t choose to ruin it. You didn’t choose anything. The Forcer made you do it. Granted, you would have done it anyway. So it seems you have just had a rather marvelous piece of luck: you avoided culpability for a grave wrong and your enemy’s life is irreparably ruined.
What about me? Am I responsible for ruining your enemy’s life? Well, first, I did not know that my activation of the Forcer would cause this ruin. And, second, I knew that my activation of the Forcer would make no difference to your enemy: he would have been ruined given the activation if and only if he would have been ruined without it. So it seems that I, too, have escaped responsibility for ruining your enemy’s life. I am, however, culpable for infringing on your autonomy. Still, given how glad you are that your enemy’s life has been ruined without your having any culpability, no doubt you will forgive me.
Now imagine instead that you activated the Forcer on yourself, and it made you post the slander. Then for exactly the same reasons as before, you aren’t culpable for ruining your enemy’s life. For you didn’t choose to post the slander. And you didn’t know that activating the Forcer would cause this ruin, while you did know that the activation wouldn’t make any difference to your enemy: activating the Forcer on yourself would not affect whether the message would be posted. Moreover, the charge of infringing on autonomy has much less force when you activated the Forcer yourself.
It is true that by activating the Forcer you lost something: you lost the possibility of being praiseworthy for choosing not to post the slander. But that’s a loss that you might judge worthwhile.
So, given soft determinism, it is in principle possible to avoid culpability while still getting the exact same results whenever you don’t know prior to deliberation how you will choose. This seems absurd, and the absurdity gives us a reason to reject the compatibility of determinism and responsibility.
But the above story can be changed to worry libertarians, too. Suppose the Forcer reads off its patient’s mind the probabilities (i.e., chances) of the various choices, and then randomly selects an action, giving each option exactly the same probability that the patient’s own choice would have had. Then in activating the Forcer, it can still be true that you didn’t know how things would turn out. And while there is no longer a guarantee that things would turn out with the Forcer as they would have without it, it is true that activating the Forcer doesn’t affect the probabilities of the various actions. In particular, in the cases above, activating the Forcer does nothing to make it more likely that your enemy would be slandered. So it seems that once again activating the Forcer on yourself is a successful way of avoiding responsibility.
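To see concretely why the probabilistic Forcer leaves the chances untouched, here is a minimal simulation sketch in Python (the 40/60 split and the function names are my own illustrative assumptions, not anything fixed by the story): an agent who chooses with given chances and a Forcer that samples from those very same chances produce the same long-run frequencies of outcomes.

    import random

    # Hypothetical choice probabilities the Forcer reads off the agent's mind
    # (illustrative numbers only; nothing in the story fixes these values).
    choice_probs = {"post_slander": 0.4, "refrain": 0.6}

    def agent_chooses():
        # The agent's own undetermined choice, made with the given chances.
        return random.choices(list(choice_probs), weights=list(choice_probs.values()))[0]

    def forcer_acts():
        # The probabilistic Forcer: it samples an action from the very same chances.
        return random.choices(list(choice_probs), weights=list(choice_probs.values()))[0]

    def frequencies(outcomes):
        # Long-run frequency of each option in a list of sampled outcomes.
        n = len(outcomes)
        return {a: outcomes.count(a) / n for a in choice_probs}

    N = 100_000
    print("agent alone:  ", frequencies([agent_chooses() for _ in range(N)]))
    print("Forcer active:", frequencies([forcer_acts() for _ in range(N)]))
    # Both print roughly {'post_slander': 0.4, 'refrain': 0.6}: activating the
    # Forcer changes what produces the action, but not the chance of any outcome.

The sketch only illustrates the chance claim, of course; it says nothing about who, if anyone, is responsible for the sampled outcome.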
But while that is true, it is also true that if libertarianism is true, regular activation of the Forcer will change the shape of one’s life, because there is no guarantee that the Forcer will decide just like you would have decided. So while on the soft determinist story, regular use of the Forcer lets one get exactly the same outcome as one would otherwise have had, on the libertarian version, that is no longer true. Regular use of the Forcer on libertarianism should be scary—for it is only a matter of chance what outcome will happen. But on compatibilism, we have a guarantee that use of the Forcer won’t change what action one does. (Granted, one may worry that regular use of the Forcer will change one’s desires in ways that are bad for one. If we are worried about that, we can suppose that the Forcer erases one’s memory of using it. That has the disadvantage that one may feel guilty when one isn’t.)
I don’t know that libertarians are wholly off the hook. Just as the Forcer thought experiment makes it implausible to think that responsibility is compatible with determinism, it also makes it implausible to think that responsibility is compatible with there being precise objective chances of what choices one will make. So perhaps the libertarian would do well to adopt the view that there are no precise objective chances of choices (though there might be imprecise ones).
Alex:
I have a few objections:
1. Even if the world is deterministic, it does not follow that we humans can read minds like that, with any greater accuracy than we could if the world were not deterministic.
2. Leaving 1. aside, if the agent is determined to choose X, it is not possible that the device changes that (because it's already determined), so at most the device can read what the agent intends to do at the point the device reads her mind. In fact, if the device forced her to do X, she was determined to be forced to do X by the device (and that was determined before the mind reading). At most, you might say the device forced her to do what she would have been determined to do had the device not existed. But that's also not clear, because she might have changed her mind later - determinism does not preclude last-minute changes of heart or even make them less probable, as long as those too are determined. So, your activation of the Forcer might make a difference. It might force a person to do what they would otherwise not have done. The question is how probable it is that it will make a difference, but that does not require determinism. On that note, there are cases in which we can tell that a person will immorally harm another, regardless of whether this world happens to be deterministic.
As for your not knowing and whether you're guilty, I would say that:
First, if the device can read what a person will do if not forced, then it seems to me you can use the device to know what they'll do if not forced, and so what the device will do (if not stopped by someone else by force, perhaps, or something like that). Your choice not to know what the dangerous robot you unleash will do is still a choice you're guilty of.
Second, even without a device, you can make an assessment on the basis of the other person's behavior. And sometimes a person does know that someone else is going to immorally hurt another person's reputation, by means of false allegations, etc. So, you may have known anyway. But moreover, even if you did not know for certain in this particular case, and for some odd reason found a way out of the previous point, you're guilty of placing the person's reputation, etc., at a very high risk of being unjustly destroyed by your actions, if you reckoned that the probability of that was high, even if not so high that it was beyond a reasonable doubt. On the other hand, if you properly reckoned the probability was not very high (or you just didn't bother assessing it), and you go around doing that sort of thing (i.e., to more than one person), then you're both guilty of taking away people's freedom for no good reason, and guilty of placing others at a significant risk of being unjustly harmed by your actions. But if you do it to a single person without making a probabilistic assessment that they would probably go through with it, then you might or might not get that result. You might or might not be taking away the freedom of someone who wouldn't behave immorally if you did not intervene.
Regarding whether they would forgive you, they might, or they might not. Personally, if someone were to do that to me and take away my freedom, I would not at all be inclined to do so. In fact, I would consider that a very serious offense, and I would hope that whoever did that to me gets a long time in prison.
Now, you might use your device to read the minds of many people until you find one who would forgive you, but then, many others probably would not forgive you for reading their minds against their will with your device (does the device look at brains, or what does it look at?).
Granted, that might not apply to self-use. But in that case, a person often knows what she'll do, regardless of whether the world is deterministic.
On libertarianism, you say: "But while that is true, it is also true that if libertarianism is true, regular activation of the Forcer will change the shape of one’s life, because there is no guarantee that the Forcer will decide just like you would have decided."
That is true, but I would argue that the same may happen on determinism. While you might set up a deterministic world in which that is not the case, there is no good reason to believe that if our world is deterministic, it is like that. It very probably is not, since the device will act on incomplete information, because it's supposed to read a mind in a way that does not destroy the brain, does not restrain the person, etc., and that very likely will not yield anything like 100% predictive accuracy.
Granted, sometimes we know what people will do, and so sometimes there is something close to probability 1, even without using devices. But that's regardless of whether determinism is actually true. In fact, when we're talking about self-use, I would say that that may well often be the case.
On that note: "So perhaps the libertarian would do well to adopt the view that there are no precise objective chances of choices (though there might be imprecise ones)."
I'd say it's about epistemic probability (objective too, in the sense that there is an objective fact of the matter as to what the proper epistemic probabilistic assessment is), and in any case, imprecision won't help, because that does not prevent things from being so probable that they're beyond a reasonable doubt.
For example, I know that I will post this post I'm writing. This is regardless of whether or not there are precise chances, whether the world is deterministic, etc. - for example, I don't know whether the world is deterministic, but I do know I will post it (it's beyond a reasonable doubt).
I have a tough time with this Forcer business. We can be tempted to do something that is nasty and bad to someone. Our own personal justifications can act as a Forcer. I still say we can resist. Sometimes resisting requires all we've got. We can do things that diminish our ability to resist. It's been a long day so I am grasping at the words and line of thought.
This whole Forcer business became quite terrifyingly real to me several weeks ago. We had a shooting in Great Mills High School which is around the corner from where I live. A 17-year-old boy fatally shot his ex-girlfriend and then fatally shot himself at the same time the school resource officer shot him. To make matters more personal, I know the shooter's father. The kid stole his father's Glock to kill his ex-girlfriend. People say he just snapped, as if snapping was the Forcer here. I say BS. People don't just snap suddenly out of the blue. This was brewing for a while. A Forcer cannot act unless we surrender bits and pieces of our free wills first.
People want to remove responsibility from this kid by saying he just snapped. The breakup did it to him. As if that was an irresistible Forcer here. I say that is BS. The kid knew it was wrong. It was premeditated. Every step of the way he could have stopped, right up to the trigger pull. However, he weakened himself with each step. It is hard to write things right now, but someone has to say it: if we put personal responsibility back in, then we drive the Forcer out of the equation.
A good counterforce to the Forcer is a proper fear of God's judgement and a proper fear of Hell. You cannot force a horse to do anything if it is really scared.