Showing posts with label Sleeping Beauty.

Thursday, May 23, 2024

A supertasked Sleeping Beauty

One of the unattractive ingredients of the Sleeping Beauty problem is that Beauty gets memory wipes. One might think that normal probabilistic reasoning presupposes no loss of evidence, and weird things happen when evidence is lost. In particular, thirding in Sleeping Beauty is supposed to be a counterexample to Van Fraassen’s reflection principle, that if you know for sure you will have a rational credence of p, you should already have one. But that principle only applies to rational credences, and it has been claimed that forgetting makes one not be rational.

Anyway, it occurred to me that a causal infinitist can manufacture something like a version of Sleeping Beauty with no loss of evidence.

Suppose that:

  • On heads, Beauty is woken up at 8 + 1/n hours for n = 2, 4, 6, ... (i.e., at 8.5 hours or 8:30, at 8.25 hours or 8:15, at 8.166… hours or 8:10, and so on).

  • On tails, Beauty is woken up at 8 + 1/n hours for n = 1, 2, 3, ... (i.e., at 9:00, 8:30, 8:20, 8:15, 8:12, 8:10, …).

Each time Beauty is woken up, she remembers infinitely many wakeups. There is no forgetting. Intuitively she has twice as many wakeups on tails, which would suggest that the probability of heads is 1/3. If so, we have a counterexample to the reflection principle with no loss of memory.

Alas, though, the “twice as many” intuition is fishy, given that both infinities have the same cardinality. So we’ve traded the forgetting problem for an infinity problem.

Still, there may be a way of avoiding the infinity problem. Suppose a second independent fair coin is tossed. We then proceed as follows:

  • On heads+heads, Beauty is woken up at 8 + 1/n hours for n = 2, 4, 6, ...

  • On heads+tails, Beauty is woken up at 8 + 1/n hours for n = 1, 3, 5, ...

  • On tails+whatever, Beauty is woken up at 8 + 1/n hours for n = 1, 2, 3, ....

Then when Beauty wakes up, she can engage in standard Bayesian reasoning. She can stipulatively rigidly define t1 to be the current time. Then the probability of her waking up at t1 if the first coin is heads is 1/2, and the probability of her waking up at t1 if the first coin is tails is 1. And so by Bayes, it seems her credence in heads should be 1/3.
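The update in this paragraph can be checked with exact arithmetic. A minimal sketch (the variable names are mine, not the post's):

```python
from fractions import Fraction

# Exact Bayes for the two-coin supertask case.  Fix any particular
# wakeup time t1 = 8 + 1/n hours.
p_first_heads = Fraction(1, 2)
p_first_tails = Fraction(1, 2)

# If the first coin is heads, Beauty wakes at t1 only when the second
# coin's result matches the parity of n: probability 1/2.  If the first
# coin is tails, she wakes at t1 for sure.
p_wake_given_heads = Fraction(1, 2)
p_wake_given_tails = Fraction(1)

posterior_heads = (p_first_heads * p_wake_given_heads) / (
    p_first_heads * p_wake_given_heads + p_first_tails * p_wake_given_tails)
print(posterior_heads)  # 1/3
```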

There is now neither forgetting nor fishy infinity stuff.

That said, one can specify that the reflection principle only applies if one can be sure ahead of time that one will at a specific time have a specific rational credence. I think one can do some further modifying of the above cases to handle that (e.g., one can maybe use time-dilation to set up a case where in one reference frame the wakeups for heads+heads are at different times from the wakeups for heads+tails, but in another frame they are the same).

All that said, the above stories all involve a supertask, so they require causal infinitism, which I reject.

Friday, May 17, 2024

Yet another argument for thirding in Sleeping Beauty?

Suppose that a fair coin has been flipped in my absence. If it's heads, there is an independent 50% chance that I will be irresistibly brainwashed tonight after I go to bed in a way that permanently forces my credence in heads to zero. If it's tails, there will be no brainwashing. When I wake up tomorrow, there will be a foul taste in my mouth from the brainwashing drugs if and only if I've been brainwashed.

So, I wake up tomorrow, find no taste of drugs in my mouth, and I wonder what I should do with my credence in heads. The obvious Bayesian approach would be to conditionalize on not being brainwashed, and lower my credence in heads to 1/3.

Next let’s evaluate epistemic policies in terms of a strictly proper scoring accuracy rule (T,F) (i.e., T(p) and F(q) are the epistemic utilities of having credence p when the hypothesis is in fact true or false respectively). Let’s say that the policy is to assign credence p upon observing that I wasn’t brainwashed. My expected epistemic utility is then (1/4)T(p) + (1/4)T(0) + (1/2)F(p). Given any strictly proper scoring rule, this is optimized at p = 1/3. So we get the same advice as before.
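The optimization can be verified numerically. A sketch, assuming the Brier score as the strictly proper rule (any strictly proper rule gives the same optimum):

```python
from fractions import Fraction

# Brier score as the strictly proper rule (an assumed concrete choice):
# T(p) = -(1-p)^2 when heads is true, F(p) = -p^2 when heads is false.
def T(p):
    return -(1 - p) ** 2

def F(p):
    return -p ** 2

def expected_utility(p):
    # heads & not brainwashed (prob 1/4): credence p, heads true  -> T(p)
    # heads & brainwashed     (prob 1/4): credence 0, heads true  -> T(0)
    # tails                   (prob 1/2): credence p, heads false -> F(p)
    return Fraction(1, 4) * T(p) + Fraction(1, 4) * T(0) + Fraction(1, 2) * F(p)

candidates = [Fraction(k, 300) for k in range(301)]
best = max(candidates, key=expected_utility)
print(best)  # 1/3
```

Taking the derivative by hand gives the same answer: (1/2)(1 − p) − p = 0 exactly at p = 1/3.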

So far so good. Now consider a variant where, on heads, the 50% chance is of being put into a coma for the rest of my life rather than of being brainwashed. I think it shouldn't matter whether I am brainwashed or put in a coma. Either way, I am no longer an active Bayesian agent with respect to the relevant proposition (namely, whether the coin was heads). So if I find myself awake, I should assign 1/3 to heads.

Next consider a variant where instead of a coma, I'm just kept asleep for all of tomorrow. Thus, on heads, I have a 50% chance of waking up tomorrow, and on tails I am certain to wake up tomorrow. It shouldn't make a difference whether we're dealing with a life-long coma or a day of sleep. Again, if I find myself awake, I should assign 1/3 to heads.

Now suppose that for the next 1000 days, each day on heads I have a 50% chance of waking up, and on tails I am certain to wake up, and after each day my memory of that day is wiped. Each day is the same as the one day in the previous experiment, so each day I am awake I should assign 1/3 to heads.

But by the Law of Large Numbers, this is basically an extended version of Sleeping Beauty: on heads I will wake up on approximately 500 days and on tails on 1000 days. So I should assign 1/3 to heads in Sleeping Beauty.
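The long-run frequency claim can be checked by simulation. A rough Monte Carlo sketch, which operationalizes the credence as the fraction of all awake-days on which the coin was heads (a thirder-style way of counting, and my framing rather than the post's):

```python
import random

random.seed(0)

def awake_days_in_run(days=1000):
    """One run of the 1000-day experiment: returns (heads?, days awake)."""
    heads = random.random() < 0.5
    if heads:
        # On heads, each day independently has a 50% chance of a wakeup.
        return True, sum(random.random() < 0.5 for _ in range(days))
    # On tails, you wake every day.
    return False, days

heads_days = total_days = 0
for _ in range(10_000):
    heads, awake = awake_days_in_run()
    total_days += awake
    if heads:
        heads_days += awake

# On roughly a third of all awake-days the coin was heads.
print(round(heads_days / total_days, 2))  # close to 1/3
```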

Tuesday, March 17, 2015

Decision theory, evidence and Sleeping Beauty

I've been thinking about the following variant of Sleeping Beauty. A coin is tossed on Sunday, out of your sight. As usual, if the coin is heads, you'll wake up on Monday, and then sleep through until Wednesday. If it's tails, you'll wake up on Monday and Tuesday. But this isn't standard Sleeping Beauty. Your memory of the Monday wakeup won't be erased on Tuesday. Instead, you will be given a drug that makes it impossible for you to update your credence as to heads on Tuesday.

You wake up. It's Monday. You know it's Monday, because you don't remember an earlier wakeup. How should you set your credence?

Evidentially speaking, it's clear and uncontroversial. Your credence in heads evidentially should still be 1/2. A Monday wakeup is no evidence for or against heads. (Now, a Tuesday wakeup would be a different matter—it is conclusive evidence against heads, but it would be evidence you are unable to update on due to the drug.)

But suppose that both on Monday and, if it's tails, on Tuesday you will be offered choices from a single broad and diversified portfolio of bets regarding whether the coin landed heads. Suppose, further, that you will be unable to decide except on the basis of maximizing expected utility (with respect to your credences). Then decision-theoretically, you should assign credence 1/3. A simple argument is that if the experiment is repeated, then in 1/3 of the times you're choosing from the portfolio it will in fact be heads and 2/3 of the times you're choosing from the portfolio it will in fact be tails (remember that once you set the credence on Monday, it won't be able to change on Tuesday). So you should gamble as if you had credence 1/3, and to do that you need your credence to be 1/3 since I assumed that you cannot but bet on the basis of your credence.
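The "1/3 of the betting occasions" frequency is easy to simulate. A minimal sketch of that bookkeeping (my code, not the post's):

```python
import random

random.seed(1)

# Count betting occasions across many repetitions: heads yields one
# occasion (Monday), tails yields two (Monday and Tuesday, with Monday's
# credence locked in by the drug).
heads_occasions = total_occasions = 0
for _ in range(100_000):
    heads = random.random() < 0.5
    occasions = 1 if heads else 2
    total_occasions += occasions
    if heads:
        heads_occasions += occasions

print(round(heads_occasions / total_occasions, 2))  # close to 1/3
```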

Interestingly, the same result follows if you're maximizing total lifetime expected epistemic utility with respect to a proper scoring rule: You should assign 1/3 to heads.
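With the Brier score standing in for the proper scoring rule (an assumed concrete choice), the lifetime-optimal credence can be checked directly:

```python
from fractions import Fraction

def T(p):
    return -(1 - p) ** 2  # Brier score when heads is true

def F(p):
    return -p ** 2        # Brier score when heads is false

def lifetime_score(p):
    # heads (prob 1/2): one wakeup scored T(p)
    # tails (prob 1/2): two wakeups, each scored F(p),
    # since Tuesday's credence is forced to equal Monday's.
    return Fraction(1, 2) * T(p) + Fraction(1, 2) * 2 * F(p)

best = max((Fraction(k, 300) for k in range(301)), key=lifetime_score)
print(best)  # 1/3
```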

Yet evidentially your credence should be 1/2. This illustrates the fact that when you expect your future credences to have a chance of being irrational—because of your inability to update on Tuesday—then we have a conflict between what, on the one hand, the evidence supports and what, on the other hand, decision theory and epistemic utility maximization support.

The original Sleeping Beauty case, where you can't tell if it's Monday or Tuesday because your memory has been erased, has some similarity to this. For while in my modified case, Tuesday's credence is forced by a drug to be the same as Monday's, in the original Sleeping Beauty case Tuesday's credence is forced to be the same as Monday's due to memory loss and the fact that, presumably, you will make up your mind in the same way given the same data.

This similarity suggests that we should be suspicious of concluding that evidentially your credence should be 1/3 from the fact that both decision-theoretic and epistemic utility considerations lead to 1/3 in the original Sleeping Beauty case.

I only want to make this modest point. I think that's the only point the analogy supports. The analogy is not strong enough to support the conclusion that one should assign 1/2 in the original Sleeping Beauty case. But it is enough, I think, to show that cases like Sleeping Beauty are going to be exceptions to the correspondence between evidential and utility (whether pragmatic or epistemic) considerations.

Monday, March 16, 2015

Internal time, external time, probability and disagreement

Suppose that Jim lives a normal human life from the year 2000 to the year 2100. Without looking at a clock, what probability should Jim attach to the hypothesis that an even number of minutes has elapsed from the year 2000? Surely, probability 1/2.

Sally, on the other hand, lives a somewhat odd human life from the year 2000 to the year 2066. During every even-numbered minute of her life, her mental and physical functioning is accelerated by a factor of two. She can normally notice this, because the world around her, including the second hands of clocks, seems to slow down by a factor of two. She has won many races by taking advantage of this. An even-numbered external minute subjectively takes two minutes. Suppose that Sally is now in a room where there is nothing in motion other than herself, so she can't tell whether this was a sped-up minute or not. What probability should Sally attach to the hypothesis that an even number of minutes has elapsed from the year 2000?

If we set our probabilities by objective time, then the answer is 1/2, as in Jim's case. But this seems mistaken. If we're going to assign probabilities in cases like this—and that's not clear to me—then I think we should assign 2/3. After all, subjectively speaking, 2/3 of Sally's life occurs during the even-numbered minutes.

There are a number of ways of defending the 2/3 judgment. One way would be to consider relativity theory. We could mimic the Jim-Sally situation by exploiting the twin paradox (granted, the accelerations over a period of a minute would be deadly, so we'd have to suppose that Sally has superpowers), and in that case surely the probabilities that Sally should assign should be looked at from Sally's reference frame.

Another way to defend the judgment would be to imagine a third person, Frank, who lives all the time twice as fast as normal, but during odd-numbered minutes, he is frozen unconscious for half of each second. For Frank, an even-numbered minute has 60 seconds' worth of being conscious and moving, while an odd-numbered minute has 30 seconds' worth of it, and external reality stutters. If Frank is in a sensory deprivation chamber where he can't tell if external reality is stuttering, then it seems better for him to assign 2/3 to its being an even-numbered minute, since he's unconscious for half of each odd-numbered one. But Frank's case doesn't seem significantly different from Sally's. (Just imagine taking the limit as the unconscious/conscious intervals get shorter and shorter.)

A third way is to think about time travel. Suppose you're on what is subjectively a long trip in a time machine, a trip that's days internal time long. And now you're asked if it's an even-numbered minute by your internal time (the time shown by your wristwatch, but not by the big clock on the time machine console, which shows external years that flick by in internal minutes). It doesn't matter how the time machine moves relative to external time. Maybe it accelerates during every even-numbered minute. Surely this doesn't matter. It's your internal time that matters.

Alright, that's enough arguing for this. So Sally should assign 2/3. But here's a funny thing. Jim and Sally then disagree on how likely it is that it's an even-numbered minute, even though it seems we can set up the case so they have the same relevant evidence as to what time it is. There is something paradoxical here.

A couple of responses come to mind:

  • They really have different evidence. In some way yet to be explained, their different prior life experiences are relevant evidence.
  • The thesis that there cannot be rational disagreement in the face of the same evidence is true when restricted to disagreement about objective matters. But what time it is now is not an objective matter. Thus, the A-theory of time is false.
  • There can be rational disagreement in the face of the same evidence.
  • There are no meaningful temporally self-locating probabilities.

Thursday, October 16, 2014

Continuous Sleeping Beauty

A coin is tossed without the result being shown to you. If it's heads, you are put in a sensory deprivation chamber for 61 minutes. If it's tails, you are put in it for 121 minutes. Data from your past sensory deprivation chamber visits shows that after about a minute, you will lose all track of how long you've been in the chamber. So now you find yourself in the chamber, and realize that you've lost track of how long you've been there. What should your credence be that the coin landed heads?

Why is this a Sleeping Beauty case? Well, take the following discretized version. If it's heads, you get woken up 1,001,000 times and if it's tails, you get woken up 2,001,000 times. There is no memory wiping, but empirical data from past experiments shows that you completely stop keeping track of wake-up counts after you've been woken up a thousand times. So now you've been woken up, and you know you've stopped counting. What should your credence be? This is clearly a version of Sleeping Beauty, except that instead of memory-wiping we have a cessation of keeping count, which plays the same role of being a non-rational process disturbing normal rational processes.

Oddly, though, in the sensory deprivation chamber case, I have the intuition that you should go for 1/2, even though in the original Sleeping Beauty case I've argued for 1/3. I don't have much intuition about my discretized version of the sensory deprivation chamber case.

P.S. I was thinking of blogging another Sleeping Beauty case, but it looks like LessWrong has essentially beaten me to it. (There may be a published version somewhere, too.)

Thursday, March 20, 2008

Another argument for thirding in Sleeping Beauty

As usual, a fair coin is flipped on Sunday, without you seeing the result, and then you go to sleep.

Experiment 1 (standard Sleeping Beauty):
Tails: You get woken up Monday and Tuesday. Your memory is erased each time, and you don't know whether it's Monday or Tuesday when you wake up.
Heads: You get woken up Monday but not Tuesday.
Question: What should your credence in heads be when you wake up?

Experiment 2:
As soon as you have fallen asleep, a second coin is tossed. If it is heads, "Monday" is written down on a hidden blackboard in the experimenter's office, and if it is tails, "Tuesday" is written down on that board. You never see that board.
Tails: You get woken up Monday and Tuesday. Your memory is erased each time as in Experiment 1.
Heads: You get woken up on the day whose name is written in the experimenter's office, but not on the other day.
Question: What should your credence in the first coin's being heads be when you wake up?

I now claim (i) in Experiment 2, the answer is 1/3 regardless of how biased the second coin is, and (ii) it follows from (i) that the answer is 1/3 in Experiment 1.

Claim (ii) is intuitively clear. It shouldn't matter whether the heads wakeup day is Monday or Tuesday.

The harder claim to argue for is (i). Here goes. I am now awake. I give a new rigidly-designating name to today. Maybe the way I do it is I pick a bunch of letters at random to form the name (I neglect the probability that on multiple wakeups I'll choose the same name). So, let's say I have named this day "Xhfure". Let A be the following event: The name of the day written on the experimenter's blackboard refers to Xhfure. Note that A is a contingent event and has prior probability 1/2. Let H and T be the events of the first coin being heads or tails respectively. What is the most specific evidence I now have? I submit it is the following: (H and A) or T. For on heads I am woken only on the day named on the blackboard, so today must be that day, while on tails I am woken on both days. Let this evidence be E.

So, now I ask: What is P(H|E)? This is an easy calculation. P(H and E) = P(H and A) = P(H)P(A) = 1/4. P(E) = P(H)P(A) + P(T) = (1/2)(1/2) + (1/2) = 3/4. Thus, P(H|E) = (1/4)/(3/4) = 1/3.
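The per-awakening heads frequency in Experiment 2 can also be checked by simulation. A rough sketch, counting awakening-occasions (which is the thirder's way of reading the question):

```python
import random

random.seed(2)

# Experiment 2, counted per awakening: on heads you wake only on the
# blackboard day (one awakening); on tails you wake both days (two).
heads_awakenings = total_awakenings = 0
for _ in range(100_000):
    first_heads = random.random() < 0.5
    board_day = random.choice(["Monday", "Tuesday"])  # second coin's result
    wakeups = [board_day] if first_heads else ["Monday", "Tuesday"]
    total_awakenings += len(wakeups)
    if first_heads:
        heads_awakenings += len(wakeups)

print(round(heads_awakenings / total_awakenings, 2))  # close to 1/3
```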

Monday, February 11, 2008

An approach to the Sleeping Beauty problem

Experiment 1:
A fair coin is flipped on Sunday, without you seeing the result, and then you go to sleep.
Tails: You get woken up Monday and Tuesday and each time shown a red flag. Your memory is erased each time, and you don't know whether it's Monday or Tuesday when you wake up.
Heads: You get woken up Monday and shown a red flag.
So, you're awake and see a red flag. What probability should you assign to heads? The two most common options are 1/2 and 1/3.

Experiment 1 is equivalent to the Sleeping Beauty problem, with a red flag added, which changes nothing.

Experiment 2:
Same as Experiment 1, except that on heads, you get woken up on both Monday and Tuesday, but on Tuesday you see a white flag.

It seems clear to me that in Experiment 2 you should assign 1/3 to heads. It's a standard Bayesian thing—you start with 1/2, and update on the additional information of a red flag: given heads, the chance of a red flag is 1/2, and given tails, the chance of a red flag is 1, so the red flag is evidence for tails, and the numbers work out to a probability 1/3 for heads.
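The numbers in the update above work out as follows, in exact arithmetic:

```python
from fractions import Fraction

# Exact Bayesian update on seeing a red flag in Experiment 2.
prior_heads = Fraction(1, 2)
p_red_given_heads = Fraction(1, 2)  # heads: red on Monday, white on Tuesday
p_red_given_tails = Fraction(1)     # tails: red on both wakeups

posterior_heads = (prior_heads * p_red_given_heads) / (
    prior_heads * p_red_given_heads + (1 - prior_heads) * p_red_given_tails)
print(posterior_heads)  # 1/3
```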

I now claim that you should give the same answer for both experiments. This implies you should assign 1/3 to heads on Experiment 1.

Two arguments for equivalence. First, we can imagine a continuum of cases between Experiments 1 and 2, where the amount of awareness you have on Tuesday given heads varies continuously from zero (Experiment 2) to full human awareness (Experiment 1). Where exactly you would be on this spectrum on a white flag Tuesday does not seem to me to affect what credence is rational when you see a red flag and are fully humanly conscious.

Second argument. You can imagine that you're not told ahead of time whether you're in Experiment 1 or 2, but when you wake up and see a red flag you can press a button that you know has the following effect. If the coin was tails, the button does nothing. If the coin was heads (in which case it's Monday, since on Tuesday you don't see a red flag, if you're awake at all) and you press the button, then you'll wake up on Tuesday and see a white flag; if you don't press it, you won't wake up on Tuesday.

If you press the button, you're effectively in Experiment 2. If you don't, you're effectively in Experiment 1. If different credences of tails are appropriate in the two experiments, then how you decide about the button press after the coin toss affects what credence you should assign to tails. That's weird. (Is there a better choice of button press, one that will provide me with a better credence?)

Variant on second argument: You either do or do not see an independent random process depress the button when you wake up—you don't control the button. Should the outcome of this random process affect your credence about the initial coin toss? Certainly not: the outcome of this process is ex hypothesi independent of everything else. So, you should assign the same credence to tails in Experiments 1 and 2, and that credence should be 2/3.

I've been told that the claim that 1/3 is the right answer for Experiment 2 would be controversial. If so, then the argument only shows the credence is the same in the two cases, not that it's 1/3. But I think 1/3 is the right answer for Experiment 2.