Showing posts with label identity theory. Show all posts

Wednesday, October 20, 2021

Is Lewis's identity theory a type-type identity theory?

David Lewis’s 1983 identity theory of mind holds that:

  1. For each mental state type M there is a causal role RM such that to be a state of type M is to fulfill RM.

  2. For each actually occurring mental state type M, the causal role RM is fulfilled by physical states and only physical states.

It is normal to take Lewis’s identity theory to be a type-type identity theory.

But a type-type identity theory identifies being a state of type M with some physical state type. Whether Lewis’s identity theory is a type-type identity theory thus depends on whether fulfilling RM counts as a physical state type.

Here are two accounts of what makes a type T be a physical type:

  3. Everything falling under T is physical.

  4. Necessarily everything falling under T is physical.
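The contrast between the two accounts is just the placement of a necessity operator. As a sketch (the predicate letters P for "is physical" and T for "falls under the type" are labels assumed here, not the post's own notation):

```latex
% Account (3): everything actually falling under T is physical
\forall x\, (Tx \rightarrow Px)

% Account (4): necessarily, everything falling under T is physical
\Box\, \forall x\, (Tx \rightarrow Px)
```

The ghost counterexample below trades on the fact that the first formula is vacuously true when nothing actually falls under T.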

If (3) is the right account of the physicality of a state type, then Lewis’s theory is a type-type identity theory, because everything that fulfills RM is physical according to (2).

However, (3) is an inadequate account of the physicality of a type. Consider the type ghost. That’s paradigmatically not a physical type. But in fact, trivially, everything that is a ghost is physical, simply because there are no ghosts. If one objects that only instantiated types count, then we can note that by (3) the type ghost-or-pig also counts as a physical type, whereas it surely does not.

It seems to me that (4) is a much better account of a physical type. However, on (4) for Lewis’s theory to count as a type-type identity theory, he would need a version of (2) strengthened by deleting “actually” and inserting “Necessarily” in front. And Lewis’s arguments do not establish such a stronger version of (2). Lewis’s arguments are quite compatible with RM having non-physical realizers in other possible worlds.

That said, perhaps (4) is not the right account of the physicality of a type either. Consider the type believed by God to be an electron. Necessarily, everything falling under this type is an electron, hence physical. But because the definition of the type makes use of supernaturalist vocabulary, the type does not seem to be physical. This criticism points towards an account of a physical type like this:

  5. The type T is expressible wholly in terms that natural science uses.

It’s essential for this to fit with Lewis’s theory that causation be one of the terms that natural science uses. But now imagine that we live in a world where one being causes spacetime, and it’s a non-physical being. Clearly, the type cause of spacetime is expressible wholly in natural scientific vocabulary, but given that the one and only instance of this type is non-physical, it sure doesn’t sound like a physical type! Indeed, if (5) is how we understand physical types, then a type-type identity theory does not even imply a token-token identity theory!

We might try to combine (3) with (5):

  6. Everything falling under T is physical and the type T is expressible wholly in terms that natural science uses.

But now imagine that there is no being that causes spacetime and all spatiotemporal entities, but that it is possible for there to be such a being, and that any such being would necessarily be non-physical. In that case causes spacetime and all spatiotemporal entities satisfies (6) trivially, but is surely not a physical type, because the only possible instances of it would be non-physical. (If one objects that types need to be instantiated, just disjoin this type with the type pig, as we did in the ghost case.)

So perhaps our best bet is to combine (4) with (5). But any account on which (4) is a necessary condition for the physicality of a type is an account that goes beyond Lewis’s, because it requires the stronger version of (2) with actuality replaced by necessity.

I conclude that Lewis’s account isn’t really a type-type identity theory, except in the inadequate senses of the physicality of a type given by (3), (5), or (6).

Friday, March 20, 2009

Identity theory of mind

Here is a quick, and no doubt well-known, argument that mental states are not token-token identical with brain states. The argument makes assumptions I reject, but they are assumptions that, I think, will be plausible to a materialist. The idea is this. It is possible to transfer my mind into a computer, while preserving at least some of my memories, and with my brain being totally destroyed in the process (I reject this myself, but I think the materialist should accept Strong AI, and her best bet for a theory of personal identity is some version of the memory theory, which should allow this). Were such a transfer to be made, then I would have some of the numerically same token mental states (e.g., a memory of a particular embarrassing episode) as I do now. But if these mental state tokens are now identical with brain state tokens, then it follows that it is possible that some of my brain states can survive the destruction of my brain, without any miracle, just by means of technological manipulation. But no brain state token of the sort that is correlated with memories[note 1] can survive the destruction of the brain, perhaps barring a miracle.[note 2] Hence, the mental states are not identical with brain states.

Of course, one might try a four-dimensionalist solution, supposing some temporally extended entity that coincides with the brain state prior to the destruction of the brain and with the electronic state after the destruction of the brain. But that won't save identity theory—it will only yield the claim that the mental state is spatiotemporally coincident with a brain state, or constituted by the brain state, vel caetera.

Maybe, though, what the identity theorist needs to do is to disambiguate the notion of a "brain state". In one sense, a brain state is the state of the brain's being a certain way. Call that an "intrinsic brain state" (note: it may be somewhat relational—I am not worried about that issue). If identity theory is understood in this way, the above argument against the identity theory works (assuming materialism, etc.). But a different sense of "brain state" is: a state of the world which, right now, as a matter of fact obtains in virtue of how a brain is.

Thus, consider the following state S: Alex's brain being gray, or there being a war in France. State S now obtains in virtue of how my brain is. But state S obtained in 1940 in the absence of my brain, since I did not exist then; instead, it obtained in virtue of there being a war in France. The state S is now a brain state, though earlier it wasn't. Call such a thing a "jumpy brain state": it can jump in and out of heads.

The identity theorist who accepts the possibility of mind transfer had better not claim that mental state tokens are identical with intrinsic brain state tokens but rather must hold that they are identical with jumpy brain state tokens. Put that way, the identity theory is much tamer than one might have thought. In fact, it is not clear that it says anything beyond the claim that the present truthmakers for mental state attributions are brain states.

Also, consider this. Presumably, for any jumpy brain state S, there is an intrinsic brain state S*, which right now coincides with S, and which is such that S obtains in virtue of S*. Thus, corresponding to the jumpy state Alex's brain being gray, or there being a war in France, there is the intrinsic brain state Alex's brain being gray. There is now a sense in which our identity theory is not faithful to its founding intuition that mental states are the states that neuroscience studies. For neuroscience certainly does not study jumpy brain states (neuroscience as such is not about wars in France, or information on hard drives). Rather, neuroscience studies intrinsic brain states. The identity theorist's mental state is identical with some jumpy brain state S, but it is S* that neuroscience studies.

And so there is a sense in which the identity theory is a cheat, unless it is supplemented with a non-psychological theory of personal identity that bans mind transfer between brains and computers. But the latter supplementation will, I think, also ban AI, since if computers can be intelligent, minds can be transferred between computers (think of a networked computation—the data can move around the network freely), and it would be weird if they could be transferred between computers but not from a brain to an appropriately programmed computer. Moreover, once one bans AI, one has made a claim that intelligence requires a particular kind of physical substrate. And then it becomes difficult to justify the intuition that aliens with completely different biochemical constitution (even an electronic one—cf. the aliens in Retief's War) could have minds.