Wednesday, April 2, 2014

Functionalism, biological antireductionism and dualism

According to functionalism, a mental state such as a pain is characterized by its causal roles. But if one physical state plays the causal role of pain, so do many others, and so the characterization fails. For instance, if neural state N plays the causal role of pain in me, so does the conjunction of N with my having blue eyes. One could require minimality of the state, but that won't help. First, plausibly, there is no minimal state that plays the role: if a state plays it, so does that state minus a particle. Second, even if there is one, it is very unlikely to be unique. There is likely to be redundancy, and there will be many ways of getting rid of redundancy.
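
Here is the point in miniature, as a toy model in Python (everything in it, the worlds and the sufficiency test, is invented for illustration): suppose playing the pain role were just a matter of sufficing for aversive behavior. Then any conjunctive strengthening of a sufficient state also suffices, so the role is massively multiply occupied:

    # Toy model: a world is a dict of features; a state is a predicate
    # on worlds; a state "plays the pain role" here iff every world
    # where it obtains is a world with aversive behavior.
    worlds = [
        {"neural_N": True,  "blue_eyes": True,  "aversive": True},
        {"neural_N": True,  "blue_eyes": False, "aversive": True},
        {"neural_N": False, "blue_eyes": True,  "aversive": False},
        {"neural_N": False, "blue_eyes": False, "aversive": False},
    ]

    def plays_pain_role(state):
        return all(w["aversive"] for w in worlds if state(w))

    N = lambda w: w["neural_N"]
    N_and_blue_eyes = lambda w: w["neural_N"] and w["blue_eyes"]

    print(plays_pain_role(N))                # True
    print(plays_pain_role(N_and_blue_eyes))  # True: the role is multiply occupied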

The solution to this problem in the spirit of Lewisian functionalism is to restrict one's quantifiers to natural states. There are two ways of doing this. First, we could restrict the quantifiers to states which are sufficiently natural, whose degree of unnaturalness is below some threshold. (An obvious way to measure unnaturalness is to measure the length of the shortest linguistic expression that expresses the state in terms that are perfectly natural.) But this is unlikely to work. If mental states have degreed unnaturalness, presumably there will be a lot of variation in the degree of unnaturalness. Some mental states will, for instance, fall far below the threshold. Those states could then be made slightly more complicated while still staying below the threshold, so once again we would have a problem.
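
To make the measure concrete, here is a toy version of it (the atoms, connectives, and scores are stand-ins I have invented): score a state by the shortest and/or/not formula over perfectly natural atoms that expresses it.

    from itertools import product

    ATOMS = ["P", "Q"]  # stand-ins for perfectly natural properties

    def evaluate(f, v):
        if isinstance(f, str):
            return v[f]
        op, *args = f
        if op == "not":
            return not evaluate(args[0], v)
        left, right = (evaluate(a, v) for a in args)
        return (left and right) if op == "and" else (left or right)

    def truth_table(f):
        vals = [dict(zip(ATOMS, bits))
                for bits in product([False, True], repeat=len(ATOMS))]
        return tuple(evaluate(f, v) for v in vals)

    def formulas(size):
        # all formulas built from exactly `size` atoms and connectives
        if size == 1:
            yield from ATOMS
            return
        for f in formulas(size - 1):
            yield ("not", f)
        for k in range(1, size - 1):
            for f in formulas(k):
                for g in formulas(size - 1 - k):
                    yield ("and", f, g)
                    yield ("or", f, g)

    def unnaturalness(state):
        # length of the shortest formula with the same truth table
        target, size = truth_table(state), 1
        while not any(truth_table(f) == target for f in formulas(size)):
            size += 1
        return size

    print(unnaturalness("P"))                # 1
    print(unnaturalness(("and", "P", "Q")))  # 3

On this toy measure, complicating a state costs only a couple of symbols, so a state comfortably below a threshold has slightly more complicated neighbors that are still below it, which is exactly the problem.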

So we had better restrict our quantifiers to perfectly natural states, at least in the case of the basic mental states (or maybe protomental states—I won't distinguish these) out of which more complex ones are built. Thus we have our first conclusion:

  1. If functionalism is true, basic mental states are perfectly natural.
This has an interesting corollary. Presumably no macroscopic state of a purely physical computer is perfectly natural. Thus:
  2. If functionalism is true, a purely physical computer has no basic mental states, and hence no mental states.
Thus, the only way a computer could have mental states is if it wasn't purely physical. (Richard Swinburne once suggested to me that if a computer had the right functional complexity, God could create a soul for it.)

What about organisms? Well, if organisms are purely physical, then their mental states will be biological states (subject to evolution and the like). So:

  3. If functionalism is true, then some of the biological states of a minded purely physical organism are perfectly natural.
This is an antireductionist conclusion. Thus,
  4. Functionalism implies that either all minded organisms have non-physical states (dualism) or some minded organisms have perfectly natural biological states (antireductionism), or both.
Moreover, our best account of naturalness is that it is fundamentality. If that is the right account, then our antireductionism is pretty strong: it says that some biological states are fundamental.

Moreover, functionalism is the only tenable version of physicalism (I say). Thus:

  5. Physicalism implies biological antireductionism.

4 comments:

Brian Cutter said...

The distinction between (Lewisian) filler-functionalism and role-functionalism seems important here. According to the first, our pain concept is a priori equivalent to a description like "the state that plays role R," and so pain is whatever first-order state it is that satisfies this description. According to the second, pain is identical to the second-order state of being in some state or other that plays role R. It seems that your arguments apply only to filler-functionalism, not to role-functionalism. If I am currently undergoing many states that play role R, then I cannot be said to be in *the* state that plays role R, but I can be said to be in some state or other that plays role R.

(My impression is that role functionalism is the more popular form of functionalism. Two advantages of role-functionalism: (i) it can accommodate multiple realizability without claiming (as Lewis does) that "pain" is non-rigid and that what state pain is is species-relative. (ii) It needn't endorse the implausible claim that "pain" is a priori equivalent to any functional description, so it can recognize the conceivability of zombies. On the other hand, two disadvantages: (i) As Brian McLaughlin has emphasized, role-functionalism has trouble securing mental causation. (ii) Role-functional states are non-categorical, whereas pain and other phenomenal states seem to be paradigmatically categorical.)

Alexander R Pruss said...

So maybe the physicalist functionalist can say that what I have is another argument for role functionalism.

But I wonder if a modified version of my argument couldn't be run against role functionalism.

First, it is at least somewhat plausible that the number of pains one has is equal to the number of states one has that play a pain role. But then we have way more pains than it seems!

Second, and more importantly, I suspect that if one allows oneself very unnatural states, one can create states that play role R even when one doesn't have the mental state associated with R.

A pain state can be finked in such a way that the organism does not satisfy any of the global counterfactuals--i.e., counterfactuals about the behavior of the organism as a whole, rather than of neural subsystems--that are normally associated with pain. A plausible functionalism needs to accept this possibility, and so it shouldn't be based on global counterfactuals, but on the functional interconnections between neural subsystems. Otherwise, we are too close to behaviorism.

So now consider two organisms, A and B, with the same global counterfactuals, but with A being in finked pain and B not being in pain. I suspect--here is where detail work is needed--that if one allows oneself really unnatural states and the addition or subtraction of finks for them, one will be able to find an isomorph of A's mental system in B. And then on role-functionalism, B will be in pain, contrary to the assumption.

Brian Cutter said...

Re: the first point: I had a similar concern, but I suppose there's not too much cost in simply denying the claim. The claim doesn't intuitively seem to be a necessary truth, for if it did, zombies wouldn't seem possible. But they do. (Of course, this is closely related to the counterintuitiveness of functionalism itself.)

Re: the second point (this also applies to the original argument, to some extent): A lot of complicated issues here, but one quick point. There is some independent plausibility to the claim that in order for a state to be eligible to figure in causal relations, it must meet some minimum threshold of naturalness. (E.g. I put on a jacket because I was cold, not because I was cold-or-hungry-or-popular.) Neo-Humean views of laws usually impose a similar requirement on the properties that figure in laws, so this condition might follow if there is a tight connection between laws and causation. Or on counterfactual theories, we might say that x's being F causes y's being G iff (very roughly!) (i) these events are distinct, (ii) the latter counterfactually depends on the former, and (iii) F (and maybe also G) is natural to at least some threshold degree. (We might also add a Yablo-esque proportionality constraint, which might further reduce the proliferation of role-occupants.)
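
To fix ideas, the counterfactual clause might be sketched like this (a rough toy; the names, degrees, and threshold are all invented):

    # Causation as counterfactual dependence plus a naturalness filter.
    THRESHOLD = 5  # minimum degree of naturalness for causal eligibility

    def causes(c, e, depends, naturalness):
        # (i) distinct events, (ii) e counterfactually depends on c,
        # (iii) the cause is natural to at least the threshold degree
        return c != e and depends(e, c) and naturalness(c) >= THRESHOLD

    naturalness = {"cold": 9, "cold-or-hungry-or-popular": 1}.get
    depends = lambda e, c: True  # grant the dependence in both cases

    print(causes("cold", "jacket on", depends, naturalness))                       # True
    print(causes("cold-or-hungry-or-popular", "jacket on", depends, naturalness))  # False

The filter lets my being cold cause the jacket-donning while screening off the gerrymandered disjunction, and so would thin out the role-occupants.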

Alexander R Pruss said...

1. Regarding the second point, imagine that while B is not in pain, B is very close to being in pain. It may turn out that a state that's only a little less natural than A's pain state will realize the pain role in B. I am not sure how to work out the details, especially since all this stuff about the roles is pie in the sky (we have nothing like a detailed picture of what the roles are, besides handwaving things about pains causing aversive behavior, etc.)

2. Imagine a black box with ten buttons, labeled 0 through 9, and a display. The box functions as follows. When you press two buttons in sequence, it displays the product of the two button numbers. If you press button 3 without having pressed anything first, it is in a state that realizes a "storing 3 for future calculation" role: call this role S3, and call the realizer R3.

(*) Then, if the box is in R3, and you press button 5, R3 + pressing button 5 causes the display to show 15.

Notice that (*) is true no matter how complex the machinery in the box is, and no matter how R3 is implemented in the box. The box might be a Chinese room or even all of China.
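
For concreteness, here is one toy implementation of the box, together with a deliberately gerrymandered one (both classes are my inventions); each satisfies (*):

    class Box:
        def __init__(self):
            self.stored = None   # realizer of the "storing n" role
            self.display = None

        def press(self, n):
            if self.stored is None:
                self.stored = n  # pressing 3 first puts the box in R3
            else:
                self.display = self.stored * n  # (*): R3 + pressing 5 shows 15
                self.stored = None

    class BaroqueBox(Box):
        # Same role, far messier realizer: the stored digit is smeared
        # across a gerrymandered list of (arbitrary tag, bit) pairs.
        def press(self, n):
            if self.stored is None:
                self.stored = [(17 * i + 1, (n >> i) & 1) for i in range(4)]
            else:
                digit = sum(b << i for i, (_, b) in enumerate(self.stored))
                self.display = digit * n
                self.stored = None

    for box in (Box(), BaroqueBox()):
        box.press(3)  # the box is now in a realizer of S3
        box.press(5)
        print(box.display)  # 15 both times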

This means that the states that realize functional roles can be arbitrarily complex in their inner structure, and still play causal roles.

But if unnaturalness is measured by the length of the shortest description in a language all of whose terms refer to perfectly natural objects and relations, then R3 will be *very* unnatural.
