Friday, September 14, 2018

The value of knowledge

Here’s a curious phenomenon. Suppose I believe p and have enough justification for p that if p is in fact true, then I know p, but suppose also that my credence in p is less than 1.

Now consider some proposition q that is statistically independent of p and unlikely to be true. Finally consider the conjunctive proposition r that p is true and q is false.

If I were to learn for sure that r is true, my credence in p would rise to 1 (since r entails p), but it wouldn’t change whether I know whether p is true.

If I were to learn for sure that r is false, my credence in p would go down. How much it would go down depends on how unlikely q is. Fact: if P(q)=(1−P(p))/P(p), where P is the prior probability, then if I learn that r is false, my credence in p goes to 1/2.
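Here is a minimal numerical check of that fact (my own sketch, assuming the update in question is ordinary Bayesian conditionalization, and taking a prior of 0.9 for p purely as an illustration):

```python
# Sanity check: with P(q) = (1 - P(p)) / P(p), conditionalizing on not-r
# sends the credence in p to 1/2. The prior 0.9 is only an illustration.
Pp = 0.9                       # prior credence in p (anything above 1/2 works)
Pq = (1 - Pp) / Pp             # prior credence in q, per the fact above

# r is "p and not-q", with p and q statistically independent.
P_r = Pp * (1 - Pq)
P_not_r = 1 - P_r              # equivalently P(not-p) + P(p and q)

P_p_given_r = Pp * (1 - Pq) / P_r        # = 1, since r entails p
P_p_given_not_r = (Pp * Pq) / P_not_r    # = 1/2, by the choice of P(q)

print(P_p_given_r, P_p_given_not_r)      # roughly 1.0 and 0.5
```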

OK, so here’s where we are. For just about any proposition p that I justifiedly take myself to know, but that I assign a credence less than 1 to, I can find a proposition r with the property that learning that r is true increases my credence in p and that learning that r is false lowers my credence in p to 1/2.

So what? Well, suppose that the only thing I value epistemically is knowing whether p is true. Then if I am in the above-described position, and if someone offers to tell me whether r is true, I should refuse to listen. Here is why. Either p is true or it is not. If p is true, then my belief in p is knowledge. In that case, I gain nothing by learning that r is true. But learning that r is false would destroy that knowledge, by reducing my credence in p to 1/2. Suppose instead that p is false. Then my belief in p isn’t knowledge. In the above setup, if p is false, so is r, and so the only thing I could learn is that r is false. Learning that r is false, however, doesn’t give me knowledge whether p is true: it gives me credence 1/2, which is neither good enough to know p to be true nor good enough to know p to be false. So if p is false, I gain nothing knowledge-wise either.
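To see the structure of the argument in numbers, here is a toy expected-value sketch. The scoring rule is my own assumption, not part of the argument: I get 1 unit of epistemic value if I end up knowing whether p is true and 0 otherwise, and (as above) credence 1/2 is not enough for knowledge.

```python
# Toy model: the only epistemic value is knowing whether p (worth 1), and a
# credence of 1/2 is not enough for knowledge. Numbers are illustrative.
Pp = 0.9
Pq = (1 - Pp) / Pp   # the P(q) that sends credence to 1/2 on learning not-r

# Refuse to hear whether r: I keep my knowledge exactly when p is true.
ev_refuse = Pp * 1

# Listen: if p is true, r is true with probability 1 - P(q) and I keep my
# knowledge; if r turns out false, my credence drops to 1/2 and the knowledge
# is gone. If p is false, I have no knowledge whether p either way.
ev_listen = Pp * (1 - Pq) * 1

print(ev_refuse, ev_listen)   # roughly 0.9 vs 0.8: refusing comes out ahead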

So, if all I care about epistemically is knowing the truth about some matter, sometimes I should refuse relevant information on the basis of epistemic goals (Lara Buchak argues in her work on faith that sometimes I should refuse relevant information on the basis of non-epistemic goals; that’s a different matter).

I think this is not a very good conclusion. I shouldn’t refuse relevant information on the basis of epistemic goals. Consequently, by the above argument, knowing the truth about some matter shouldn’t be my sole epistemic goal.

Indeed, it should also be my goal to avoid thinking I know something that is in fact false. If I add that to my goals, the conclusion that I should refuse to listen to whether r is true disappears. For if p is false, although learning that r is false wouldn’t give me knowledge whether p is true, in that case it would take away the illusion of knowledge. And that would be valuable.
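Extending the same toy sketch with the second goal makes the point concrete. The weights below are again my own illustrative assumptions: knowing whether p is worth K, and falsely taking myself to know p costs B.

```python
# Same toy model, now with a second goal: falsely taking myself to know p
# costs B, while knowing whether p is worth K. Weights are illustrative.
Pp, K, B = 0.9, 1.0, 1.5
Pq = (1 - Pp) / Pp

# Refuse: if p is true I keep knowledge; if p is false I keep the illusion.
ev_refuse = Pp * K - (1 - Pp) * B

# Listen: if p is true and r is true, knowledge is kept; if r is false, my
# credence drops to 1/2, so no knowledge but also no illusion; if p is false,
# r is false too, and learning that strips the illusion without giving knowledge.
ev_listen = Pp * (1 - Pq) * K

print(ev_refuse, ev_listen)
# Roughly 0.75 vs 0.8: in this toy model listening ties refusing when B = K
# and beats it when B > K, so the case for refusing the information disappears.
```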

Nothing deep in the conclusions here. Just a really roundabout argument for the Socratic thesis that it’s bad to think you know when you don’t.
