Friday, March 20, 2009

Identity theory of mind

Here is a quick, and no doubt well-known, argument that mental states are not token-token identical with brain states. The argument makes assumptions I reject, but they are assumptions that, I think, will be plausible to a materialist. The idea is this. It is possible to transfer my mind into a computer, while preserving at least some of my memories, and with my brain being totally destroyed in the process (I reject this myself, but I think the materialist should accept Strong AI, and her best bet for a theory of personal identity is some version of the memory theory, which should allow this). Were such a transfer made, I would have some of the numerically same token mental states (e.g., a memory of a particular embarrassing episode) as I do now. But if these mental state tokens are now identical with brain state tokens, then it follows that it is possible for some of my brain states to survive the destruction of my brain, without any miracle, just by means of technological manipulation. But no brain state token of the sort that is correlated with memories[note 1] can survive the destruction of the brain, perhaps barring a miracle.[note 2] Hence, mental states are not identical with brain states.
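
The modal skeleton of the argument can be set out explicitly. Here is a minimal sketch, where m is a mental state token, b is the brain state token it is allegedly identical with, and S(x) abbreviates "x survives the destruction of the brain without a miracle"; the appeal to the necessity of identity in (3) is my reconstruction of the inference, not something stated above:

\begin{align*}
&(1)\quad \Diamond S(m) && \text{the mind transfer scenario is possible}\\
&(2)\quad \neg\Diamond S(b) && \text{no such brain state token can survive}\\
&(3)\quad m = b \rightarrow \Box(m = b) && \text{necessity of identity}\\
&(4)\quad m = b \rightarrow \Diamond S(b) && \text{from (1) and (3), by the indiscernibility of identicals}\\
&(5)\quad m \neq b && \text{from (2) and (4)}
\end{align*}

On this reconstruction, a materialist who rejects the conclusion must deny (1), (2), or (3).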

Of course, one might try a four-dimensionalist solution, supposing some temporally extended entity that coincides with the brain state prior to the destruction of the brain and with the electronic state after the destruction of the brain. But that won't save the identity theory; it will only yield the claim that the mental state is spatiotemporally coincident with a brain state, or constituted by the brain state, or the like.

Maybe, though, what the identity theorist needs to do is to disambiguate the notion of a "brain state". In one sense, a brain state is the state of the brain's being a certain way. Call that an "intrinsic brain state" (note: it may be somewhat relational; I am not worried about that issue). If the identity theory is understood in this way, the above argument against it works (assuming materialism, etc.). But there is a different sense of "brain state": a state of the world which, right now, as a matter of fact obtains in virtue of how a brain is.

Thus, consider the following state S: Alex's brain being gray, or there being a war in France. State S now obtains in virtue of how my brain is. But state S obtained in 1940 in the absence of my brain, since I did not exist then; instead, it obtained in virtue of there being a war in France. The state S is now a brain state, though earlier it wasn't. Call such a thing a "jumpy brain state": it can jump in and out of heads.
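
Schematically (the lettering is mine): let G be the state of Alex's brain being gray and W the state of there being a war in France, so that S = G ∨ W. Then:

\begin{align*}
&\text{2009:} && G \text{ obtains}, \; W \text{ does not}; \; S \text{ obtains in virtue of } G \text{ (a brain state)}\\
&\text{1940:} && W \text{ obtains}, \; G \text{ does not}; \; S \text{ obtains in virtue of } W \text{ (no brain involved)}
\end{align*}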

The identity theorist who accepts the possibility of mind transfer had better not claim that mental state tokens are identical with intrinsic brain state tokens but rather must hold that they are identical with jumpy brain state tokens. Put that way, the identity theory is much tamer than one might have thought. In fact, it is not clear that it says anything beyond the claim that the present truthmakers for mental state attributions are brain states.

Also, consider this. Presumably, for any jumpy brain state S, there is an intrinsic brain state S*, which right now coincides with S, and which is such that S obtains in virtue of S*. Thus, corresponding to the jumpy state Alex's brain being gray, or there being a war in France, there is the intrinsic brain state Alex's brain being gray. There is now a sense in which our identity theory is not faithful to its founding intuition that mental states are the states that neuroscience studies. For neuroscience certainly does not study jumpy brain states (neuroscience as such is not about wars in France, or information on hard drives). Rather, neuroscience studies intrinsic brain states. The identity theorist's mental state is identical with some jumpy brain state S, but it is S* that neuroscience studies.
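
The presumption can be regimented as follows (my notation, writing t for the present time):

\forall S \,\exists S^{*}\, \big[\, \mathrm{Intrinsic}(S^{*}) \;\wedge\; \mathrm{Obtains}(S^{*}, t) \;\wedge\; S \text{ obtains at } t \text{ in virtue of } S^{*} \,\big]

In the running example, S is G ∨ W and S* is G; it is S*, not S, that shows up under the neuroscientist's instruments.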

And so there is a sense in which the identity theory is a cheat, unless it is supplemented with a non-psychological theory of personal identity that bans mind transfer between brains and computers. But the latter supplementation will, I think, also ban AI, since if computers can be intelligent, minds can be transferred between computers (think of a networked computation: the data can move around the network freely), and it would be weird if they could be transferred between computers but not from a brain to an appropriately programmed computer. Moreover, once one bans AI, one has made a claim that intelligence requires a particular kind of physical substrate. And then it becomes difficult to justify the intuition that aliens with a completely different biochemical constitution (even an electronic one; cf. the aliens in Retief's War) could have minds.

1 comment:

  1. Wow, this is really cool. Thanks.

    BTW Prof. Pruss, I just discovered your blog and it's great. You discuss a lot of fascinating topics. I am a doctoral candidate in math at Dartmouth, so I'll hopefully have a PhD in math too, and I thus very much enjoy your analytic mind. God love you.