Monday, February 1, 2016

A secondary brain and computational theories of consciousness

There is an urban myth that the Stegosaurus had a secondary brain to control its rear legs and tail. Even though it's a myth, such a thing could certainly happen. I want to explore this thought experiment:

At the base of my spine, I grow a secondary brain, a large tail, and an accelerometer like the one in my inner ear. Moreover, sensory data from the tail and the accelerometer is routed only to the secondary brain, not to my primary brain, and my primary brain cannot send signals to the tail. The secondary brain eavesdrops on nerve signals between my legs and my primary brain, and based on these signals and accelerometer data it positions the tail in ways that improve my balance when I walk and run. The functioning of this secondary brain is very complex and is such as to suffice for consciousness--say, of tactile and kinesthetic data from the tail and orientation data from the accelerometer--if computational theories of consciousness are correct.

What would happen if this happened? Here is an intuition:

  1. The thought experiment does not guarantee that I would be aware of data from the tail.
  2. If a computational theory of consciousness is correct, the thought experiment guarantees that something would be aware of data from the tail.

Suppose, then, a computational theory of consciousness is correct. Then it would be possible for the thought experiment to happen and for me to be unaware of data from the tail, by (1). By (2), in this scenario, something other than me would be aware of data from the tail. What would this something other than me be? It seems that the best hypothesis is that it would be the secondary brain. But by parity, then, the subject of the consciousness of everything else should be the primary brain. But I am and would continue to be the subject of the consciousness of everything else, and there is only one subject there. So I am a brain.

Thus, the thought experiment gives me a roundabout argument that:

  3. If a computational theory of consciousness is correct, I am a brain.

Of course (3) is fairly plausible apart from the thought experiment, but it's always nice to have another argument.

So what? Well:

  4. My hands are a part of me.
  5. My hands are not a part of any brain.
  6. So, I am not a brain.
  7. So, computational theories of consciousness are false.

The thought experiment is still interesting even if computational theories of consciousness are false. If we had such secondary brains, would we, or anything else, feel what's going on in the tail? I think it could metaphysically go either way, depending on non-physical details of the setup.


Heath White said...

Your hypothesis may not be so far from reality.

Also, split brain patients may be able to have two centers of consciousness.

Dagmara Lizlovs said...


All this makes me wonder whether you are a brain or not: if we took your brain out of your head and successfully put it into the head of a chimpanzee, and took the chimpanzee's brain out of the chimp's head and successfully put it in your head, what would we have then? Who now is you? :-)

Alexander R Pruss said...

I would be the recipient of a lot of chimpanzee organs.

Dagmara Lizlovs said...

And what would your students think?

Heath White said...

Upon further reflection, I think there is a lot more to reflect on.

1) It seems possible that one body could have multiple centers of consciousness (closely conjoined Siamese twins? split brain patients?).

2) What is being "aware of data"? There could be conscious awareness, and unconscious information processing; is the latter being aware? For there is all sorts of data from our bodies that our brains process, without us being consciously aware of it.

3) What is the criterion for "X is a part of me"? If it is something like, "I unconsciously process data from X" then there is no particular reason some body part could not be a part of multiple selves. If it is "I am consciously aware of data from X" then maybe (? why not?) that could be true of multiple selves as well. If it is "I can consciously control X" then possibly this could be true of only one self at a time, but then there are lots of parts of our bodies that are not part of ourselves (e.g. our digestive system). Moreover, there is no reason, in principle, that one self could not consciously control one aspect of a body part while another self controlled another aspect of it.

Alexander R Pruss said...


Agreed about 1. My contention, however, is that it is possible for me to be unaware of the stuff in this secondary brain. This intuition is akin to the kinds of thought experiments people give for dualism, like that it's possible to have a zombie, but I happen to have a stronger intuition here than I do about zombies. A second brain that happens to be a part of me just need not be something I am aware of.

Regarding 2, I am definitely thinking of conscious awareness.

Regarding 3, I don't think parthood can *reduce* to things like processing of data. It seems to me that either parthood is primitive or pretty close to being primitive (the latter is a bit vague). But processing data from X is *evidence* of X's being a part of you.

Heath White said...

I think a computational-theory-of-consciousness defender should begin this way: what we call a “self” is a center of consciousness, typically instantiated by a brain, bound up with a material body, in the sense that (i) the consciousness receives data from the body and (ii) the consciousness can control the body. The boundaries of awareness, control, and body are somewhat vague and, for contingent biological reasons, roughly overlapping, though in principle they can come apart. What a “self” is not, however, is some kind of simple or fundamental entity.

From that perspective, your initial thought experiment is one where there are two selves sharing a common body. This is going to scramble our intuitions a bit, but one could say that the selves are the centers of consciousness plus the parts of the body they control, or receive data from, or something along those lines. One need not say that the self is the brain.

And that perspective on selves also would defuse the second argument, from parthood.

I think what is really at issue here is a theory of personal identity, rather than a theory of consciousness per se.

P.S. Does the argument boil down to this?

Possibly, I have (or: one person has) two brains.
Not possibly, I (or: one person) have/contain/am two selves.
Therefore, my self is not my brain.