Problem 1
Functionalism can best be seen as a response to the multiple realizability argument against type-identity physicalism: the same kinds of mental events can happen in carbon-based brains, silicon-based chips, plasma-based alien minds, etc., but if a belief that 2+2=4 is a particular configuration of neurons in a brain, then no critter without neurons could believe that 2+2=4. So the functionalist says that there is a functional isomorphism between states that all of these could have, and that this isomorphism is all that's needed for mental sameness. At the same time, the functionalist, unlike the behaviorist, is interested in functional states at a lower level than mere inputs and dispositions to outputs.
I am now thinking that functionalism is subject to a higher-level multiple-realizability worry. Start with the intuition that the same computational results can be obtained through different, non-isomorphic algorithms (think of insertion sort and quicksort). Very plausibly, the same behavioristic states—relations between inputs and dispositions to outputs—can be obtained through different, non-isomorphic functional arrangements. Imagine, then, an alien being, a product of natural selection in an environment different from ours, that has a mind that is not functionally isomorphic to ours, but where the alien is basically behavioristically isomorphic to us. Why wouldn't this be possible? (That question is not much of an argument, I know.) We would, I think, rightly assume that this being is in fact a person, and has beliefs, feelings, etc., despite the lack of functional isomorphism. But the functionalist must deny that such an alien would have beliefs, feelings, etc., since those kinds of states are defined by their functional connections, and the alien doesn't have those functional connections.
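To make the sorting analogy concrete, here is a minimal sketch in Python (not from the original post; the function names and sample list are just illustrative): insertion sort and quicksort give the same mapping from unsorted inputs to sorted outputs, even though their intermediate steps have no isomorphic correspondence.

def insertion_sort(xs):
    # Build up a sorted prefix by inserting each element into place.
    result = list(xs)
    for i in range(1, len(result)):
        key = result[i]
        j = i - 1
        while j >= 0 and result[j] > key:
            result[j + 1] = result[j]
            j -= 1
        result[j + 1] = key
    return result

def quick_sort(xs):
    # Recursively partition the list around a pivot element.
    if len(xs) <= 1:
        return list(xs)
    pivot, rest = xs[0], xs[1:]
    return (quick_sort([x for x in rest if x < pivot])
            + [pivot]
            + quick_sort([x for x in rest if x >= pivot]))

# Same input-output map, non-isomorphic internal procedures.
sample = [5, 2, 9, 1, 5]
assert insertion_sort(sample) == quick_sort(sample) == sorted(sample)

The point of the sketch is only that sameness of input-output behavior does not require sameness of internal organization.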
Epistemically, we seem to be behaviorists. Maybe this is only pragmatic—we don't want to mistreat someone who might turn out to be a person (this suggestion is due to Todd Buras). But suppose it's more than that. Then functionalism is in trouble, unless it can supply an argument that only systems that are (approximately?) functionally isomorphic to us could be (approximately?) behavioristically isomorphic to us.
The theistic dualist can do well here. Because of the great value of rational beings, there is good reason for God to bestow souls on any natural kind of being whose behavior is sufficiently sophisticated to be compatible with a mental life.
Problem 2
Two states are functionally isomorphic provided that they give the same map between inputs and outputs. In particular, the states must be connected up with isomorphically corresponding modules. But it is very unlikely that a pain in a mouse and a pain in a human are functionally isomorphic. The pain in the human is probably connected to modules (such as higher-level judgment modules) that have no corresponding modules in the mouse. Consequently, if functionalism holds, it is improbable that mice feel pain. The non-naturalist theist can get a nice disjunction out of this: either functionalism is false (in which case, probably, some form of dualism is true, since I think that's the best alternative to functionalism), or else the problem of animal pain is not a problem, at least with regard to less intelligent mammals.
"God bestows souls on any natural kind of being whose behavior is sufficiently sophisticated to be compatible with a mental life."
I just find that way of putting it a bit odd. It makes souls seem a bit epiphenomenal to me. Would God give a sophisticated robot a soul? Isn't it more likely that God makes souls and then for some reason incarnates them in bodies that can support some of their functions in a material world?
Hi Alex,
Problem 1: Would the functionalist attempt to explain behavior functionally as well? If so, does this have the consequence that the alien's behavior is not, after all, behavioristically isomorphic to ours? Obviously the functionalist does not want the denial of behavioral isomorphism to generate problems with multiple realizability at the first-order level, but I can't see that it would.
Problem 2: Peter Geach makes a similar move in his _Providence and Evil_ (I think that's the title). Van Inwagen criticizes the move in his Gifford Lectures. For the record, I think Geach is on to something.