## Tuesday, October 25, 2022

### Learning from what you know to be false

Here’s an odd phenomenon. Someone tells you something. You know it’s false, but their telling it to you raises the probability of it.

For instance, suppose that at the beginning of a science class you are teaching your students about significant figures, and you ask a student to tell you the mass of a textbook in kilograms. They put it on a scale calibrated in pounds, look up on the internet that a pound is exactly 0.45359237 kg, and report that the mass of the textbook is 1.496854821 kg.

Now, you know that the classroom scale is not accurate to ten significant figures. The chance that the student’s measurement was right to ten significant figures is tiny. You know that the student’s statement is wrong, assuming that it is in fact wrong.

Nonetheless, even though you know the statement is wrong, it raises the probability that the textbook’s mass is 1.496854821 kg (to ten significant figures). For while most of the digits are garbage, the first couple are likely close. Before you heard the student’s statement, you might have estimated the mass as somewhere between one and two kilograms. Now you estimate it as between 1.45 and 1.55 kg, say. That raises the probability that in fact, up to ten significant figures, the mass is 1.496854821 kg by about a factor of ten.
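The factor-of-ten figure falls straight out of the interval widths. Here is a minimal sketch of that arithmetic, assuming (as a simplification not in the post) that your credence is uniform over each interval:

```python
# Hedged sketch of the update in the post, assuming uniform credence
# over the prior and posterior intervals. The reported value happens
# to be consistent with a 3.3 lb reading times 0.45359237 kg/lb.
reported = 1.496854821  # student's reported mass in kg

prior_width = 2.0 - 1.0    # "somewhere between one and two kilograms"
post_width = 1.55 - 1.45   # "between 1.45 and 1.55 kg"

# Under a uniform distribution, the probability that the true mass
# matches the reported value to ten significant figures is inversely
# proportional to the interval width, so the update factor is the
# ratio of the widths.
factor = prior_width / post_width
print(round(factor, 6))  # about 10: credence rises by a factor of ten
```

The exact posterior interval is of course a judgment call; any similar narrowing by a factor of ten yields the same update factor.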

So, you know that what the student says is false, but your credence in the content has just gone up by a factor of ten.

Of course, some people will want to turn this story into an argument that you don’t know that the student’s statement is wrong. My preference is just to make this statement another example of why knowledge is an unhelpful category.

Alexander R Pruss said...

On reflection, the phenomenon in the first sentence of the post isn't odd at all. Typically if someone tells you something, that is evidence for what they tell you, even if you know it's not true.

Walter Van den Acker said...

Alex

If the first digits are likely close, what the student says is not completely wrong. It is inaccurate.
The reason you learn something from it is that you already have some knowledge about the book's probable mass. Suppose you ask me about the distance between the earth and the moon and I say it's 400 000 km. That's wrong, but it is a better answer than, say, 4 million km. If before you asked me you estimated the distance as between 100 000 and 1 million km, then now you estimate it as between 300 000 and 500 000 km.
The reason you can estimate it is because you have a certain confidence in my claims. Even though you know I cannot have measured the distance accurately enough to know it's 400 000 km, you are still confident that I at least know something about it.
Now compare this to a situation in which I simply tell you a random distance. The only thing you learn from this is that I am either unable or unwilling to make a genuine effort.

Alexander R Pruss said...

Even if you know that I am telling you a random distance, you still learn from it. For your knowledge of such facts as that I am telling you a random distance is never certain. It may look like I'm just making it up at random, but there is a chance that my statement is guided by the truth. (There is also a chance that my statement is guided by falsehood. But I think that, absent special evidence about me having reason to positively deceive you, that chance is smaller.) Or I may be filtering particularly ridiculous random answers (e.g., you ask me how many miles it is from Waco to Los Angeles, and I google "random number", and get 4; but that's too ridiculous, so I just say 100).

Walter Van den Acker said...

Alex

Suppose you want to go to a place P. There is only one road that leads to P, namely the road to the right.
Now you ask me which way you should go and I tell you that the left road is the correct one, which is the wrong way.
What do you learn from this?

Alexander R Pruss said...

My claim was only that typically you learn something in favor of p by being told that p is true. There are, of course, exceptions (e.g., if you know that someone is going to be lying, in which case their saying something is evidence that they disbelieve it, which in turn is evidence against it).

That said, that an intelligent person believes a clear and explicit contradiction may be some very slight evidence against the law of noncontradiction.