## Monday, April 5, 2021

### Best estimates and credences

Some people think that expected utilities determine credences, and some think that credences determine expected utilities. I think neither is the case, and I want to sketch a third view.

Let’s say that I observe people playing a slot machine. After each game, I make a tickmark on a piece of paper, and if they win, I add the amount of the win to a subtotal on a calculator. After a couple of hours—oddly not having been tossed out by the casino—I divide the subtotal by the number of tickmarks and get the average payout. If I now get an offer to play the slot machine for a certain price, I will use the average payout as an expected utility and see if that expected utility exceeds the price (in a normal casino, it won’t). So, I have an expected utility or prevision. But I don’t have enough credences to determine that expected utility: for every possible payout, I would need a credence in getting that payout, but I simply haven’t kept track of any data other than the sum total of payouts and the number of games. So, here the expected utility is not determined by the credences.
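The bookkeeping in the story can be sketched in a few lines of Python (the payout list and function names are mine, purely illustrative). The point is that nothing beyond a tick count and a running subtotal is ever stored, yet an expected utility still comes out:

```python
def observe_games(payouts):
    """Record games exactly as in the story: one tickmark per game,
    plus a running subtotal of the payouts on the calculator."""
    tickmarks = 0
    subtotal = 0.0
    for payout in payouts:
        tickmarks += 1
        subtotal += payout
    # Average payout serves as the expected utility of one game.
    return subtotal / tickmarks

def worth_playing(expected_utility, price):
    """Accept the offer only if the expected payout exceeds the price."""
    return expected_utility > price

# A couple of hours of hypothetical observations: mostly losses, a few wins.
games = [0, 0, 5, 0, 0, 0, 10, 0, 0, 0]
estimate = observe_games(games)          # 1.5
print(worth_playing(estimate, price=2.0))  # False, as in a normal casino
```

Note that no per-payout credences appear anywhere: the same average is compatible with many different payout distributions, so the data kept here underdetermines them.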

The opposite is also not true: expected utilities do not determine credences. Many different credence distributions over the payouts share the same mean, so knowing my expected utility for the game leaves my credences in the individual payouts wide open.

Now consider another phenomenon. Suppose I step on an analog scale, and it returns a number w1 for my weight. If that’s all the data I have, then w1 is my best estimate for the weight. What does that mean? It certainly does not mean that I believe that my weight is exactly w1. It also does not mean that I believe that my weight is close to w1—for although I do believe that my weight is close to w1, I also believe it is close to w1 + 0.1 lb. If I were an ideal epistemic agent, then for every one of the infinitely many possible intervals of weight, I would have a credence that my weight lies in that interval, and my best estimate would be an integral of the weight function over the probability space with respect to my credence measure. But I am not an ideal epistemic agent. I don’t actually have much of a credence for the hypothesis that my weight lies between w1 − 0.2 lb and w1 + 0.1 lb, say. But I do have a best estimate.
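For the ideal agent described above, the best estimate would be an integral of the weight against the credence measure. Here is a minimal numerical sketch of that integral, assuming (purely for illustration; the post posits no such thing) a normal credence density centered on the reading w1:

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of a normal distribution: an illustrative stand-in for an
    ideal agent's credence density over possible weights."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def best_estimate(density, lo, hi, n=10_000):
    """Midpoint Riemann-sum approximation of the integral of w * density(w) dw,
    i.e., the expectation of the weight under the credence measure."""
    dw = (hi - lo) / n
    total = 0.0
    for i in range(n):
        w = lo + (i + 0.5) * dw
        total += w * density(w) * dw
    return total

w1 = 170.0  # the scale's reading, in lb (an illustrative number)
estimate = best_estimate(lambda w: normal_pdf(w, w1, 0.5), w1 - 5, w1 + 5)
print(round(estimate, 3))  # approximately 170.0
```

The contrast with the actual agent is that writing this code requires a full credence density over weights, which, as the post says, we typically do not have.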

This is very much what happened in the slot machine case. So expected values are not the only probabilistic entities not determined by our credences. Rather, they are a special case of best estimates. The expected utility of the slot machine game is simply my best estimate of the actual utility of the slot machine game.

We form and use lots of such best estimates.

Note that the best estimate need not even be a possible value of the thing we are estimating. My best estimate of the slot-machine payoff given my data might be \$0.94, even though I might know that in fact all actual payouts are multiples of a dollar.

With this in mind, we can take credences to be nothing other than best estimates of the truth value, where we think of truth value as either 0 (false) or 1 (true). (Here, I think of the fact that the standard Polish word for probability is “prawdopodobieństwo”—truthlikeness, verisimilitude.) Just as in the case above, when my best estimate of the truth value is 0.75, I do not think the actual truth value is 0.75: I like classical logic, and think the only two possible values are 0 and 1.
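On this reading, a credence works arithmetically like any other best estimate: it is the expectation of a quantity whose only possible values are 0 and 1. A trivial sketch (my own framing of the point):

```python
def estimated_truth_value(credence):
    """Best estimate of a 0-or-1 truth value: the expectation of the
    indicator, credence * 1 + (1 - credence) * 0."""
    return credence * 1 + (1 - credence) * 0

# A credence of 0.75 is an estimate of 0.75 for the truth value,
# even though classical logic never assigns the value 0.75 itself.
print(estimated_truth_value(0.75))  # 0.75
```

Just as the \$0.94 estimate was not a possible payout, 0.75 is not a possible truth value; the estimate lives between the possible values.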

Here, then, is a picture of what one might call our probabilistic representation of the world. We have lots of best estimates. Some of these are best estimates of utilities. Some are best estimates of other quantities, such as weights, lengths, cardinalities, etc. Some are best estimates of truth values. A consistent agent is one for whom there exists a probability function such that all of the agent’s best estimates are mathematical expectations of the corresponding values with respect to that probability function. In particular, this probability function would extend the agent’s credences, i.e., the agent’s best estimates for truth values.
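On a small finite outcome space, this consistency condition can be checked directly. The following is my own toy setup, not from the post: a brute-force search for a probability function whose expectations match every one of the agent's best estimates at once.

```python
def consistent(quantities, estimates, step=0.01, tol=1e-6):
    """Grid-search over probability functions p on a three-outcome space.
    The agent is consistent iff some p makes every quantity's expectation,
    sum(p[i] * q[i]), equal the agent's best estimate of that quantity."""
    steps = round(1 / step)
    for i in range(steps + 1):
        for j in range(steps + 1 - i):
            p = (i * step, j * step, 1 - (i + j) * step)
            if all(abs(sum(pi * qi for pi, qi in zip(p, q)) - e) < tol
                   for q, e in zip(quantities, estimates)):
                return True
    return False

# Outcomes: lose, small win, big win. Two quantities: the payout, and the
# truth value of "I win" (an indicator, so its estimate is a credence).
payout = [0, 5, 10]
i_win = [0, 1, 1]

# Estimates of 1.5 for the payout and 0.2 for "I win" are jointly
# representable, e.g. by p = (0.8, 0.1, 0.1).
print(consistent([payout, i_win], [1.5, 0.2]))  # True

# No probability function makes the expectation of a 0/1 quantity 1.5,
# so an agent with that "credence" is inconsistent.
print(consistent([i_win], [1.5]))  # False
```

The search doubles as an illustration of the last sentence of the paragraph: any representing probability function automatically extends the agent's credences, since those are just the estimates of the indicator quantities.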

On this picture, there is no privileging among expected utilities, credences, and other best estimates. It’s just estimates all around.