The ideal rational agent, it seems, would respond to evidence instantly. After all, you should believe in accordance with your evidence, so when you have new evidence your beliefs should not fall behind. This means that the ideal rational agent's degrees of belief will oscillate a fair amount, since we constantly get bits of evidence in various directions.
But humans do not operate in this way. Our degrees of belief tend to update fairly slowly, and tend to be free of small oscillations this way and that. Moreover, while our commonsense evaluations criticize as closed-minded the person whose beliefs lag too far behind her evidence, we do not praise the person whose beliefs constantly oscillate in sync with the changing body of evidence. On the contrary, we are apt to think him fickle and unsteady.
Now I don't want to overstate the above point. Even in an ideal rational agent, well-confirmed beliefs tend to stay well-confirmed as new evidence comes in, and there is a granularity in our beliefs that doesn't allow us to distinguish between, say, 99.999% confidence and 99.997% confidence. But nonetheless, it seems to me that our attitudes do not favor the instantaneously evidence-responsive, and hence sometimes rapidly oscillating, degrees of belief found in the ideal rational agent. It's as if we expected people to pass their evidential support through a moving-average low-pass filter.
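To make the filter image concrete, here is a minimal sketch in Python. The helper `moving_average` and all the numbers are invented for illustration: the raw, instantly evidence-responsive confidence series wobbles with each new bit of evidence, while its moving average is much steadier.

```python
# A minimal sketch of the moving-average analogy: a noisy sequence of
# evidence-driven confidence levels, smoothed by a trailing moving average.
# All numbers here are made up for illustration.

def moving_average(xs, window):
    """Smooth a sequence with a trailing moving average of the given window."""
    out = []
    for i in range(len(xs)):
        chunk = xs[max(0, i - window + 1):i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

# Instantly evidence-responsive confidence: trending upward overall, but
# with small oscillations as individual bits of evidence point either way.
raw = [0.50, 0.58, 0.53, 0.62, 0.57, 0.66, 0.61, 0.70, 0.65, 0.72]

smoothed = moving_average(raw, window=3)
for r, s in zip(raw, smoothed):
    print(f"raw: {r:.2f}  smoothed: {s:.2f}")
```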
Are we simply mistaken in wanting greater belief steadiness from people? Shouldn't people respond instantly to evidence?
Yes and no. While I think we should apportion our beliefs to the evidence (with evidence very broadly construed), probably we only really count as having a piece of evidence when we've evaluated its impact. It takes time to do that. Moreover, because of limited resources, we will evaluate several pieces of evidence at a time in connection with a given question: we just do not have the leisure to evaluate each piece of data exactly when it first becomes possible to do so. But if we evaluate several pieces of evidence together, then this produces an effect very much like that of a moving average. Of course, once the evidence has been evaluated, belief should follow instantly. However, when someone's beliefs oscillate too much, that is a sign either that he is a really quick thinker (and few are like that) or that he is failing to evaluate evidence with the care that is called for.
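Here is a toy illustration of the point, a minimal Bayesian sketch with invented likelihood ratios. Updating piece by piece yields a credence that swings with every item; evaluating the same evidence in batches of four reaches the same final credence, but the intermediate path is much steadier, much as a moving average would be.

```python
def update_odds(odds, likelihood_ratio):
    """One Bayesian update in odds form: posterior odds = prior odds * LR."""
    return odds * likelihood_ratio

def to_prob(odds):
    """Convert odds back to a probability."""
    return odds / (1 + odds)

# Invented likelihood ratios: individual pieces point in varying directions.
lrs = [2.0, 0.6, 1.8, 0.7, 2.2, 0.5, 1.9, 0.8]

# Piece-by-piece updating: credence recorded after every single item.
odds = 1.0  # 50% prior
piecewise = []
for lr in lrs:
    odds = update_odds(odds, lr)
    piecewise.append(to_prob(odds))

# Batched updating: evidence is evaluated four pieces at a time, so the
# credence only moves at the end of each batch.
odds = 1.0
batched = []
for i in range(0, len(lrs), 4):
    for lr in lrs[i:i + 4]:
        odds = update_odds(odds, lr)
    batched.append(to_prob(odds))

print("piece-by-piece:", [round(p, 3) for p in piecewise])
print("batched:       ", [round(p, 3) for p in batched])
```

Both paths end at the same credence; only the batched path is free of the intermediate oscillations.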
Further, while belief should instantly follow the evaluation of evidence, the behavior of a rational agent may not look like it does. There is a cost to switching our behavior to a new track, which can make it rational to keep our old behavior (maybe only temporarily, until we find a way to switch at lower cost) even if it isn't the behavior we would have adopted had we started out with our new beliefs. If I have done form A of exercise over the years, and the latest research says that B is healthier, then it can still make sense to do A because of the costs of buying new equipment, acquiring new habits, etc. So people's behavior will, quite rationally, seem to lag behind the changing evidence, and will appear to smooth out oscillations in the evidence. I say "seem" and "appear" because in fact the continuation of the old behavior may be quite in keeping with the new beliefs once the switching costs are considered. Thus, the fickle agent whose behavior changes too much with incoming evidence is rightly criticized for not paying enough attention to switching costs.
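The exercise example can be put in toy numerical form. The numbers below are invented and the per-period "benefit" units are a deliberate simplification, but the sketch shows how a one-time switching cost can make sticking with A rational even after the evidence favors B.

```python
# A toy switching-cost calculation with invented numbers: per-period
# benefit of exercise forms A and B, a planning horizon, and a one-time
# cost of switching (new equipment, new habits, etc.).

def should_switch(benefit_a, benefit_b, periods, switching_cost):
    """Switch only if B's extra benefit over the horizon beats the cost."""
    return (benefit_b - benefit_a) * periods > switching_cost

# New evidence says B is somewhat healthier per period...
benefit_a, benefit_b = 10.0, 10.5

# ...but over a short horizon the gain doesn't cover the switching cost.
print(should_switch(benefit_a, benefit_b, periods=12, switching_cost=20.0))  # False
# Over a long enough horizon, or at a lower cost, switching becomes rational.
print(should_switch(benefit_a, benefit_b, periods=60, switching_cost=20.0))  # True
```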
Of course, there are times when behavior needs to switch much faster. If I get evidence that I have been acting unjustly, then switching costs become irrelevant: I must refrain from injustice no matter the cost. And there is, all other things being equal, a value to being an agile agent, one able to change behaviors quickly without much cost when new evidence comes. But only all other things being equal: there are values in habits that can override the value of agential agility.