Friday, January 19, 2018

A quick argument against some materialisms

  1. Any pretty simple component of us can be replaced by a functionally equivalent prosthesis that isn’t a part of us without affecting our mental functioning.

  2. It is not possible to replace all our pretty simple components by prostheses that aren’t part of us without affecting our mental functioning.

  3. Hence, we are not wholly constituted by a finite number of pretty simple components.

This argument tells against all materialisms that compose us from pretty simple components. How simple is “pretty simple”? Well, simple enough that premise 1 be true. A neuron? Maybe. A molecule? Surely. It doesn’t, however, tell against materialisms that do not compose us from pretty simple components, such as a materialism on which we are modes of global fields.

8 comments:

Martin Cooke said...

The obvious problem is that while any neuron could in theory be replaced by a machine, the failure to be able to replace all of them could be due to how bulky our machines have to be.

Also I wonder what a prosthesis for a molecule could be ... ?

Wielka Miska said...

But isn't premise 2 exactly what (most) materialists oppose?

Alexander R Pruss said...

Great Bowl:

Surely not: for then *all* our material parts would be replaced with things that aren't parts of us. And so we'd have no material parts left. How could we exist, given materialism?

Martin:

You might have to embed us in a four- or five-dimensional space, and use machines that extend into the extra dimensions. Hard to do physically, but not hard metaphysically. :-)

Wielka Miska said...

Borrowing terms from computer science, we are (according to the materialists I've read, or at least how I understand them) programs that are being executed. It surely is possible to replace a computer piece-by-piece without stopping the program that's running on it. It's not even hypothetical - it's the industry standard in distributed computing, where a "program" spans multiple nodes/servers in a cluster. Cluster hardware gets upgraded (old machines are decommissioned, etc.), yet the program never stops.
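[Editorial aside: the piece-by-piece replacement the commenter describes can be illustrated with a toy sketch. All names here (`Node`, `migrate`) are illustrative placeholders, not any real cluster API; the "program" is just a counter whose state survives as every node hosting it is swapped out.]

```python
# Toy sketch of live hardware replacement: a running "program" (a
# counter) persists while every node hosting it is decommissioned
# and replaced. Names are illustrative, not a real distributed API.

class Node:
    def __init__(self, name):
        self.name = name
        self.state = None  # holds the running program's state

def migrate(old, new):
    """Move the program's state to a fresh node, retiring the old one."""
    new.state = old.state
    old.state = None

host = Node("node-0")
host.state = 0                       # the program starts running
for i in range(1, 4):
    host.state += 1                  # the program keeps computing
    fresh = Node(f"node-{i}")
    migrate(host, fresh)             # hardware swapped underneath it
    host = fresh

print(host.name, host.state)         # same computation, all-new hardware
```

The point of the sketch is only that the program's identity is tied to its continuing execution, not to any particular piece of hardware.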

Alexander R Pruss said...

1. Programs (whether as types or as instances) do not have legs as parts. I have legs as parts. Hence I am not a program.

2. It's only plausible to say that we are programs if one has a distinction between code and data, and includes code as part of the program. (We are not data. Why? Well, we compute. Data does not compute. It is computed with.) But if one includes code as part of the program, then changes of code result in a different program. Hence, if we are programs, education, which changes our code, results in a different program, and hence a different person. Thus, education kills. But it doesn't. So we are not programs.

Wielka Miska said...

The line between code and data, or code and a computer, can get blurry in the case of self-modifying machines/programs, but I get the point.

Anyway, the matter that our bodies are made of gets fully replaced once in a while. The goal of a prosthesis is to be functionally as close to the original part as possible. So, what's the difference between having our bodies gradually replaced by perfect prostheses, and having our body cells gradually replaced in a natural biological process?

But I'm no materialist; I just find a direct argument against materialism more convincing. I believe that the Church-Turing thesis is true (as long as we're within the material realm), and I also believe that a Turing machine can't generate consciousness. The reason for the latter is this: we understand exactly what consciousness is and can reason about it, yet we cannot give a satisfying definition of it (in either a formal or an informal language). E.g., saying that being conscious means being aware of self just pushes the problem to the definition of awareness (does it just mean having some data? If so, a book can have some data, but it's not conscious). This suggests that consciousness might be some kind of basic, indivisible thing.

This actually seems to make my point even stronger than saying that consciousness isn't computable by a Turing machine: after all, the halting problem can't be solved by a Turing machine either, yet it can be defined using a language - but I've just said that we cannot even define consciousness.
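[Editorial aside: the undecidability claim invoked here is the classic diagonalization argument, which can be sketched in a few lines. The function `halts` below is a hypothetical oracle, not a real implementation; the sketch only shows that any claimed oracle must answer wrongly about some program.]

```python
# Sketch of the diagonalization behind the halting problem's
# undecidability. `halts` is a hypothetical oracle supposed to
# answer whether a given zero-argument program halts.

def make_paradox(halts):
    """Given a claimed halting oracle, build a program it misjudges."""
    def paradox():
        if halts(paradox):
            while True:       # oracle said "halts", so loop forever
                pass
    return paradox

# Whatever the oracle answers about `paradox`, it is wrong:
# if it says "halts", paradox loops; if it says "loops", paradox halts.
p = make_paradox(lambda prog: False)  # an oracle answering "loops"
p()  # returns immediately, contradicting that oracle's answer
```

The contrast the commenter draws is that this problem, though unsolvable, is at least precisely definable, whereas (on the commenter's view) consciousness is not.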

(I'm not a philosopher, so sorry if I'm just rehashing some old stuff.)

(I found out about you through Feser's books, and noticed that you might be Polish - cheers from Poland then!)

Alexander R Pruss said...

My argument does not need the claim that no prosthesis becomes a part of the person. It only needs the claim that one can replace every relatively simple part by a prosthesis that is not a part of the person. Not every prosthesis is a part of the person. For instance, suppose you replace one of my neurons by a simple WiFi transponder, which connects with the router in my house, which in turn connects with a server in the cloud that runs a neuron emulator. I don't think the server in the cloud has become a part of me. So, replace all the parts by prostheses like that.

What if someone responds: we understand exactly what pornography is and can reason about it, yet we cannot give a satisfying definition of it. But this does not suggest that pornography is some kind of basic, indivisible thing. Sadly, we cannot give satisfying definitions of almost anything in real life. We can give a definition of the halting problem for a Turing machine, but we can't give a definition of the halting problem for a real computer. For we cannot give a definition of computation for a real computer. (To be honest, I haven't tried. But that's just my conjecture.)

Wielka Miska said...

It's hard to give an exact definition of pornography, but I think it is possible to give a satisfying one (that, of course, depends on what satisfies whom). Such a definition would refer to more basic terms, like nudity, sex, and so on. Then we might get to the bottom of explanations or regress infinitely, but we would nonetheless have described it using more basic terms. It might not be a perfect definition, but I feel it would still grasp the core of the matter.

It seems to me that consciousness is somewhat different - I know exactly what it is, to the point where I have no problem with saying: thing A is consciousness, thing B isn't (A and B being anything). Yet any definition I can come up with pushes the whole concept of consciousness onto some other term.