Tuesday, July 14, 2015

Vague mental states

I've been thinking through my intuitions about vagueness and mental states, especially conscious ones. It certainly seems natural to say that it can be vague whether you are in pain or itching, or that it can be vague whether you are sure of something or merely believe it strongly. But I find very plausible the following mental non-vagueness principle:

  • (MNV) Let M be a maximally determinate mental state. Then it cannot be vague whether I am in M.

MNV is compatible with the above judgments. For if I am in a borderline case between pain and itch, it is not vague that I have the maximally determinate unpleasant conscious state U that I have. Rather, what is vague is whether U is a pain or an itch. Intuitively, this is not a case of ontological vagueness, but simply of how to classify U. Similarly, if I am borderline between sureness and strong belief, there is a maximally determinate doxastic state D that I have, and I have it definitely. But it's vague whether this state is classified as sureness or strong belief.
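In operator terms, the pain/itch case can be put as follows (the operators ∇ for "it is vague whether" and Def for "it is definite that" are my shorthand, not notation from the post):

```latex
% U = my actual maximally determinate unpleasant conscious state
\mathrm{Def}(\text{I have } U)
  \;\wedge\; \nabla(U \text{ is a pain})
  \;\wedge\; \nabla(U \text{ is an itch})
```

The vagueness attaches to the classificatory predicates "pain" and "itch", not to my having of U, so MNV is untouched.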

Interestingly, though, MNV is strong enough to rule out a number of popular theories.

The first family of theories ruled out by MNV comprises just about any theory of diachronic personal identity that allows personal identity to be vague. Psychological continuity theories, for instance, will have to render personal identity vague (on pain of a very implausible cut-off). More generally, I suspect any theory of personal identity compatible with reductive materialism will render personal identity vague. But suppose it's vague whether I am identical with a person B who exists at a later time t. Then B likely has, and surely could have, a maximally determinate mental state M at t that definitely nobody else has at that time. So if it's vague at t whether I am B, it's vague at t whether I have M, contrary to MNV.
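The argument of the last few sentences can be set out as a short derivation. Again, ∇ ("it is vague whether") and Def ("it is definite that") are my reconstruction of the reasoning, not the post's own notation:

```latex
\begin{align*}
&(1)\ \nabla(\text{I} = B)
  && \text{vague personal identity at } t\\
&(2)\ \mathrm{Def}(B \text{ has } M \text{ at } t)
  && \text{premise}\\
&(3)\ \mathrm{Def}\,\forall x\,(x \neq B \rightarrow x \text{ lacks } M \text{ at } t)
  && \text{definitely nobody else has } M\\
&(4)\ \text{I have } M \text{ at } t \leftrightarrow \text{I} = B
  && \text{from (2) and (3)}\\
&(5)\ \nabla(\text{I have } M \text{ at } t)
  && \text{from (1) and (4)}
\end{align*}
```

Line (5) contradicts MNV.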

I suppose one could weaken MNV to say only that it's not vague whether something is in M. I would resist this weakening, but even the weakened MNV is strong enough to rule out typical (i.e., non-Aristotelian) functionalist theories of mind. For suppose that my present maximally determinate mental state M is constituted by computational state C. Now imagine a sequence of possible worlds, starting with the actual one, in which my brain is more and more gerrymandered: bits of my brain are replaced by less and less natural prosthetics, in such a way that it becomes harder and harder to interpret my brain as computing C. (At some point, for instance, whether something counts as being in a computational state may depend on whether it's raining on a faraway planet.) Suppose also that nothing else computing C is introduced. Then there is a continuum of worlds, at one end of which C is computed and at the other end of which it isn't. But it would be arbitrary to posit a cut-off for where M is exemplified. So it's vague whether M is exemplified in some of these worlds, contrary to MNV.
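The sorites structure of the anti-functionalist argument can be made explicit. The world-sequence notation here is mine, added for clarity:

```latex
\begin{itemize}
\item Let $w_0, w_1, \dots, w_n$ be the sequence of worlds, with $w_0$ the actual world.
\item At $w_0$ my brain definitely computes $C$, so $M$ is definitely exemplified.
\item At $w_n$ the prosthetics are so unnatural that my brain definitely does not
      compute $C$; since nothing else computes $C$, $M$ is definitely not exemplified.
\item No $i$ marks a non-arbitrary cut-off such that $M$ is exemplified at $w_i$
      but not at $w_{i+1}$.
\item So for some $i$, it is vague whether $M$ is exemplified at $w_i$,
      contradicting even the weakened MNV.
\end{itemize}
```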
