Tuesday, November 21, 2017

Omniscience

A standard definition of omniscience is:

  • x is omniscient if and only if x knows all truths and does not believe anything but truths.

But knowing all truths and not believing anything but truths is not good enough for omniscience. One can know a proposition without being certain of it, assigning a credence less than 1 to it. But surely such knowledge is not good enough for omniscience. So we need to say: “knows all truths with absolute certainty”.
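In symbols (a rough sketch; here $K_x$, $B_x$ and $\mathrm{Cr}_x$ are just my shorthand for x's knowledge, belief and credence, not standard notation), the original definition and the strengthened one come to roughly:

$$\text{Omniscient}(x) \leftrightarrow \forall p\,\big[(p \rightarrow K_x p) \wedge (B_x p \rightarrow p)\big]$$

$$\text{Omniscient}^*(x) \leftrightarrow \forall p\,\big[(p \rightarrow (K_x p \wedge \mathrm{Cr}_x(p) = 1)) \wedge (B_x p \rightarrow p)\big]$$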

I wonder if this is good enough. I am a bit worried that maybe one can know all the truths in a given subject area but not understand how they fit together—knowing a proposition about how they fit together might not be good enough for this understanding.

Anyway, it’s kind of interesting that even apart from open theist considerations, omniscience isn’t quite as cut and dried as one might think.

27 comments:

Justin said...

Interesting. Another wrinkle, which you are probably aware of, is non-propositional knowledge. In particular, I think of Zagzebski on omnisubjectivity and Wierenga (among others) on omniscience and knowledge de se. Occasionally people express worries about knowing-how as well, but I don't know who has written extensively on omniscience and knowing-how (I'm sure someone has).

Martin Cooke said...

If you know all propositions, and in particular p, and also that you are omniscient, then surely you would be sure about p. (But would the best omniscience really be absolute certainty? Do we really need to assume that God's Self-awareness somehow (how?) includes knowledge of the non-existence of anything not grounded in His Self?)

If you know all propositions then you know all the propositions about how they fit together. (There may be additional non-propositional feelings about how they fit together, that contribute to understanding how they fit together, related to what Justin said.)

Alexander R Pruss said...

Why would you be sure? Imagine this story. A one-million-sided die was tossed yesterday. If it came up 1, your brain was put in a vat and brainwashed into thinking you are omniscient. If it came up anything else, God gave you knowledge of every true proposition. So, now, you think: Probably I know everything. But you're not sure. There is a one in a million chance that you're a brainwashed brain in a vat.
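To put rough numbers on it (a sketch, assuming your subjective evidence is the same whichever way the die came up, so that the chances carry straight over to your rational credences):

$$\mathrm{Cr}(\text{I know everything}) = P(\text{die did not come up }1) = \frac{999{,}999}{1{,}000{,}000} = 1 - 10^{-6} < 1.$$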

I don't know if knowing how the propositions fit together is enough for understanding. It may be that to understand, one not only has to know how they fit together but also has to *attend to that knowledge* in the right way.

Martin Cooke said...

Thank you for your thoughts, Alex: I guess that I would be sure because I would also know how I was omniscient and why I was omniscient and so forth, and hence would be able to rule out such scenarios as the one with the die.

I think it is the meaning of "know" in this context: if I know that I am omniscient, then I can rule out such scenarios. By contrast, if I know what the capital city of France is, then I cannot (although if I know the capital city, I may well be more sure that I know its name). So "knows all truths" is not obviously good enough, but noticing that it includes "knows that one is omniscient" shows that it is. Something analogous holds for the meaning of "know" in mathematics, analytic logic, and so on.

I think that there must be more to understanding, similarly: what "understanding" means depends on what is being understood.

Martin Cooke said...

...what I should have said was that if I know all propositions then I know the falsity of every skeptical scenario (in particular, I would know that the die did not come up 1).

Alexander R Pruss said...

Sure, you can *know* that the sceptical scenario is false. But that's not enough to know for *certain* that it is false.

I was assuming in my example that the fact that the chance of its coming up 1 was so tiny was good enough to know that it didn't come up 1. If you think one doesn't know when the chance of falsehood is only one in a million, then just increase the number of sides.

Martin Cooke said...

How do I not know for certain that it is false? Certainty is either a feeling of confidence or else some objective quality of being reliably true. The objectivity comes from actually being omniscient. The feeling of confidence comes from knowing that every skeptical scenario is false. You said: "So, now, you think: Probably I know everything. But you're not sure. There is a one in a million chance that you're a brainwashed brain in a vat." But in fact I think: "Certainly I know everything; why would I not be sure? Not only do I know that there is not that one-in-a-million chance, because I know what the number actually was, but I also know the falsity of every possible skeptical scenario, including all those that seek to undermine my knowledge of that number, and so on." But of course, why should I not just feel confident? I am omniscient, and I know that I am!

Increasing the number of sides does nothing to my argument, and putting numbers to it is not in general a good idea: one can know proposition p, and also know proposition q, and one thereby knows p and q. One can then treat the Preface paradox as a Sorites; adding numbers seems to solve the Preface paradox, but really? You know p and you know q but you do not know p and q? That is a weak sort of knowledge, unfit for ordinary thought, let alone scientific thought!

Martin Cooke said...

Btw, you say "that having a credence very close to one does not differ much from believing" (in a comment on Open Theism and Divine Perfection, another of November's posts), but what about this very-many-sided die that you mentioned in your previous comment here: you believe that it is very (very) unlikely to land with 1 uppermost, but do you really believe that it will not so land? There are a range of bets you would make, based on your credence, some of which are just what you would do if you did believe, but that is not much to base "does not differ much from" on. There are lots of examples of why not: your weight is very (very) close to that of some pile of poo, and so there are a range of actions that might be similar (especially if we include actions that are not normal for you; and I suppose that you do not do a lot of betting)...

Martin Cooke said...

...I guess my point in that last example of mine is that some people treat other people as though they are little more than masses, or little more than animals, and the wider economy might make most of us act as though we thought of other people as nothing more than economic entities; we do this all the time, most of us. And similarly, we ignore possibilities that are very unlikely, but which we cannot rule out, all the time. It is as though having a credence very close to 1 is the same as believing, and so they seem to differ little. But your very-many-sided die is actually an excellent example of where they do differ. You only need a very big bet to highlight the difference. And all the other bets are like all the things that a pile of poo of your weight has in common with you: it hardly matters how many of them there are, at least not to the magnitude of the difference. (btw I admired how neat that "OT and DP" argument was, but regarding your intuition it reminds me of the intuition that having your officers deny you while you are crucified is an imperfection in a leader of men ;-)

Martin Cooke said...

Of course, you are right: one could know all propositions but not with certainty. One would not have to put it all together, as I have been assuming. One would not have to be perfectly rational. But then, if we can have omniscience without perfect rationality, I wonder why you think that we cannot have omniscience without certainty. Science is all about not being certain (for rationalistic reasons, ironically enough). Also, I think that you are thinking that omniscience requires perfect rationality (putting it all together might be knowing how it fits together and attending to such facts). Do we think that a madman's justified beliefs might not be knowledge? So, I think that you are right: omniscience should add a proviso about perfect rationality, not one about certainty. (After all, one could be certain for no good reason, and that would hardly add anything good to one's knowledge.)

Martin Cooke said...

This very-many-sided die can be used to highlight the difference between belief and high credence, I think. E.g. we are going to the park, or the shops, or whatever, and as we set off I roll a die, and if it comes up 1 then I will just sit there thinking how amazing it is that it did. You feel disappointed in me: although there is practically no chance of the 1 coming up (as small a chance as can be), we had agreed to go to the park/shops, so it is as if I have broken a promise; I have not broken a promise, and yet: it is a little like that. There are lots of iffier things about ordinary beliefs, but they are not (or not obviously) of this wrong kind.

Martin Cooke said...

Where does this presence of "certainty" in "omniscience" come from? Is it not from the 'justification' bit of 'knowledge'? If a justified true belief gets more justification, it is better known. But at the end of the day, adding certainty without the right sort of justification adds nothing to knowledge, but may rather undermine it by raising doubts about rationality. And in exceptional circumstances, might perfect justification not give rise, rationally, to something less than certainty? Also, imagine knowing p, and then becoming slightly ill so that you irrationally become less certain of p, but not so much less certain that you stop knowing p: your knowledge is no less, and it is only less good insofar as your previous confidence had been perfectly justified; had you previously been over-confident, because of your being only human, then your knowledge would now actually be better! So, I think that starting by adding total certainty was a mistake, and that one should have started by demanding total justification, which would immediately have taken one to a requirement for perfect rationality. (Nice post btw: made me think!)

Alexander R Pruss said...

"You believe that it is very (very) unlikely to land with 1 uppermost, but do you really believe that it will not so land?"

Of course I do. If I don't believe that, I don't believe much at all. I may believe that a student will keep his promise to me to hand a paper in on time. But the chance that he won't is much higher than one in a million, even if he is super-conscientious.

If one multiplies the number of sides, the point becomes even clearer. I believe, and know, that our car is in the garage. But is the chance that my wife decided to play a practical joke on me by parking it in the backyard really less than one in a googol? Surely not, even though she's not one for practical jokes.

Martin Cooke said...

If I roll a die, I believe that it will probably not land with 1 uppermost, but I do not thereby believe that it will not so land. Some evidence that I do not: my lack of surprise when it does so land.

And how could it make any difference, the number of sides? I would not be surprised if it landed with any number uppermost, for any number of numbers. And so I really would not believe that it would not so land. To so believe seems to me to be irrational.

I find it interesting that you would; and so I still wonder, is there not a very real sense of "belief" in which you would not? And is that not the more appropriate sense when discussing omniscience? There is an ordinary use of "belief" in which "I do not believe that he will" means that you do believe that he will not, which would very clearly be inappropriate.

Martin Cooke said...

Consider the arrangement of things that you see outside, e.g. cars on the road, leaves on the trees and such, every few minutes a new arrangement; almost all of them unsurprising. Each new arrangement is very unlikely, but surely you do not believe of each one, before you see it, that it is not going to be what you see. And surely you do have beliefs about those arrangements (much as you have beliefs about each natural number).

I personally use "belief" to mean just that, something I believe, not just something that I guess. A belief of mine might be false, of course; but if I have a real reason to doubt a particular belief, then I lose that belief (with skeptical scenarios, that "real" is not there). I may replace it with a belief that something is very likely. The Preface paradox is a paradox for precisely that reason, imho.

That may just be my idiolect, but it is never the case that I believe that p and that I believe that q and that I do not believe that p and q; and if the meaning of "belief" is as modern philosophers say it is, why did humans get classical logic in the first place? And what do you do about your self-awareness? We can believe that it is likely that p is likely, and yet not believe that p.

Martin Cooke said...

"Sure, you can *know* that the sceptical scenario is false. But that's not enough to know for *certain* that it is false."

If I think that I know something, then I am certain of it. If I have a doubt about it, then I think that I do not know it. (Seeing the number come up 1 is not epistemically akin to knowing that to be unlikely, and surely the seeing is where the knowledge comes from. So if I say that I know, then I mean that I saw, or something of that kind. (Not to mention how certainty might be unjustified!)) Your kind of knowledge is a bit easy, isn't it?

Martin Cooke said...

Or consider the following fictional conversation:

Bob: Do you think that there is life on other planets?
Joe: There very probably is.
Bob: Yes. We know there is.
Joe: How do you know there is?
Bob: My conception of 'know' takes me from our thinking that there very probably is life on other planets to my knowing that there is.
Joe: Oh.

And surely our context of discussing omniscience means that we are using the stricter sense of 'know,' if it is implied in such ordinary contexts as the above.

Alexander R Pruss said...

I know that I have two hands. And I know that in the strict sense of "know". But of course the probability that I have two hands is less than one: there is a small probability that I am now dreaming after a hand amputation.

Martin Cooke said...

Although G. E. Moore might observe that insofar as you know that you have two hands, you know that you are not so dreaming, and so you know that there is no such probability. Surely skeptical scenarios are serious challenges to claims of knowledge because they do directly affect such things, in the absence of a resolution of such paradoxes.

Martin Cooke said...

There are also problems with the very idea of such small probabilities, of course. The possible worlds upon which they are based appear to be infinite in number, so the probabilities appear to be infinity divided by infinity. But even if they are finite, are they constructed out of what seems to be likely or not in this world? But it is precisely such likelihoods that are thrown into question. Or do they just exist out there? But then how do we know about them? How do you know that it is a small probability? Etc.

Martin Cooke said...

On your reading of "know" (which may well be the mainstream reading) what I said is illogical, but I do think that my reading is (also valid and) better.

E.g. my reading is factive; is yours? Here is why I wonder: you can know that a die will not come up 1, even while it might. So, suppose that it does come up 1. Did you really know that it would not? But then you can know things that are not true! (Or, if you did not really know once it came up 1, then you did not really know beforehand either; unless you can change the past, etc.)

And it occurs to me that experts in knowledge acquisition (e.g. scientists) do, as they acquire knowledge, tend to make increasingly qualified claims. It can look as though they are losing knowledge, but of course, they are merely losing an ordinary way of talking as it becomes unfit for their elevated purposes. I think that my reading of "know" is what theirs tends towards (and is therefore the one that applies to analyses of "omniscience").

Martin Cooke said...

Do you know of each number on a very-many-sided die that it won't come up? You can survey the faces visually easily enough, thinking that you know, of each, that it won't come up. But one is bound to come up, and of each that you survey you will know for sure that it might be that one. You know that it might but also that it won't?

Each time the die is thrown, a number that you knew would not come up comes up. And you are not surprised! When we talk about the 1 not coming up, we would be surprised if it did. And you are very unlikely to be shown to be wrong to have thought that it would not come up. But you are bound to have been wrong about one of the numbers.

Similarly, every time you look around you at some particular arrangement of ordinary objects, such as cars, their makes and colours, or the leaves on trees, you are seeing something that you claim that you knew you would not see. Or am I misconstruing you?

Martin Cooke said...

This comment has been removed by the author.

Martin Cooke said...

<"You believe that it is very (very) unlikely to land with 1 uppermost, but do you really believe that it will not so land?"

Of course I do. If I don't believe that, I don't believe much at all.>

But we can replace that '1' with any of the numbers on this very-many-sided die, and all of those faces can be visually surveyed by you, so that you do form such beliefs; and yet you are not at all surprised when it lands with one of those faces uppermost, when it lands in a way that you really did believe it would not.

And so I do not believe you (it is not that I think that you are lying, of course!).

You can believe a lot even without believing so many falsehoods, I think.

Martin Cooke said...

I wonder where the stress on probability comes from, in epistemology. Knowledge is justified true belief, and most of the apparent contradictions come from having two or more different standards of justification in play. You can have doubts raised under one standard that do not affect beliefs under another.

The standards for your car being in your garage are very different to those for a die-roll, but there are similarities. There is a chance that your wife swapped the die for one that is massively loaded towards 1, as a practical joke, for example, and yet we ignore that when we think of the probability of 1 turning up.

Martin Cooke said...

This comment has been removed by the author.

Martin Cooke said...

But to go back to the beginning, why not just have:
x is omniscient if and only if x knows all truths?
If x also believed falsely that p, then (since x knows all truths) x would know that p was false, and so x would have to be quite irrational to hold such a belief, and that irrationality would indicate that x did not really have knowledge.
Consequently, one should be able to go from x knowing all truths to x not believing any untruths.
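Sketched out (writing $K_x$ and $B_x$ for x's knowledge and belief, and assuming only the minimal rationality premise just mentioned, that x will not believe p while knowing $\neg p$):

$$\forall q\,(q \rightarrow K_x q),\ B_x p,\ \neg p \ \Rightarrow\ K_x \neg p \ \Rightarrow\ \text{contradiction with the rationality premise},$$

so on that assumption $\forall p\,(B_x p \rightarrow p)$ follows from $\forall p\,(p \rightarrow K_x p)$ alone.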