Here's a decision-theoretic picture of how to make the decision between A and B. First, gain as much knowledge K as is reasonably possible about the laws and present conditions in the universe. The more information, the better our decision is likely to be (cf. Good's Theorem). Then calculate the conditional expected utility of the future given A together with K, do the same for B, and perform the action whose conditional expected utility is higher.
Let U(A,K) and U(B,K) be the two conditional expected utilities. (Note: I mean this to be neutral between causal and evidential decision theories, but if I have to commit to one, it'll be causal.) We want to base our decision on U(A,K) and U(B,K) for the most inclusive K we can get.
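As a concrete gloss on this rule, here is a minimal sketch in Python. Everything in it is illustrative: the outcomes, the distribution P(world | action, K), and the utilities are made up, standing in for whatever the laws and present conditions recorded in K would actually deliver.

```python
# Minimal sketch of the decision rule above (illustrative, not the author's
# code). K is baked into the conditional distribution P(world | action, K);
# all the numbers are made up.

UTILITY = {"good": 1.0, "bad": -1.0}   # utility of each total outcome

# P(world | action, K): stand-in for what the laws plus the present
# conditions recorded in K imply about the future, given each action.
P_WORLD = {
    ("A", "good"): 0.7, ("A", "bad"): 0.3,
    ("B", "good"): 0.4, ("B", "bad"): 0.6,
}

def expected_utility(action):
    """U(action, K) = sum over worlds of P(world | action, K) * utility(world)."""
    return sum(P_WORLD[(action, w)] * u for w, u in UTILITY.items())

# Perform whichever action has the higher conditional expected utility.
best = max(["A", "B"], key=expected_utility)
print(best, expected_utility("A"), expected_utility("B"))  # A 0.4 -0.2
```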
Now imagine that we could ask an angel for any piece of information I about the present and the laws (e.g., by asking "How many hairs do I have on my head?"), and then form a new set of information K2 including I on which to calculate U(A,K2) and U(B,K2). Then we should ask for as much information as we can. But now here is a problem: if determinism holds, then once we get enough information, Kn will entail which of A and B happens. Let's say it entails A. Then U(B,Kn) is undefined, since it requires conditioning on B, an event to which Kn assigns probability zero. This tells one that one will do A, but it makes decision-making impossible.
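To make the breakdown concrete, here is a toy continuation of the sketch above (again, purely illustrative): represent the knowledge K as the set of worlds it leaves open, and watch the conditional expected utility for B stop existing once K entails A.

```python
# Toy illustration of the breakdown (illustrative only). Represent knowledge
# K as the set of (action, outcome) worlds it leaves open, with a fixed
# prior weight on each world. Conditioning on an action restricts K to the
# worlds where that action is performed and renormalizes.

PRIOR = {
    ("A", "good"): 0.35, ("A", "bad"): 0.15,
    ("B", "good"): 0.20, ("B", "bad"): 0.30,
}
UTILITY = {"good": 1.0, "bad": -1.0}

def conditional_expected_utility(action, K):
    """U(action, K): renormalize the prior over the worlds in K doing `action`."""
    worlds = [w for w in K if w[0] == action]
    total = sum(PRIOR[w] for w in worlds)
    if total == 0:
        # K entails the other action: P(action | K) = 0, so the
        # conditional expectation does not exist.
        raise ZeroDivisionError(f"U({action}, K) is undefined")
    return sum((PRIOR[w] / total) * UTILITY[w[1]] for w in worlds)

K1 = set(PRIOR)                           # modest knowledge: both actions open
print(conditional_expected_utility("B", K1))   # -0.2: well defined

Kn = {w for w in PRIOR if w[0] == "A"}    # enough information to entail A
conditional_expected_utility("B", Kn)     # raises: U(B, Kn) is undefined
```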
So how much cost-free information should we get from the angel? If we ask for so much that it entails what we're going to do, we won't be able to decide. If our choice is indeterministic, there is a simple principled answer: ask for everything about the laws and the present. But if our choice is determined, we must stop short of full information. Where should we stop?
Perhaps we ask for full information about the laws and about everything outside our minds. But the contents of our minds are often highly relevant to our decisions. For instance, if we leave the contents of our minds out of our decision-making, we won't have information about what we like and what we dislike. And in some decisions, such as when deciding whether to see a psychologist, information about our own character is crucial.
Here's another interesting question. Our angel knows all about the present and the laws. It seems that he has all the information we would want about how we should act. So we just ask: given all you know, does A or does B maximize expected utility? And he can't answer this question, for given all that he knows, only one of the two conditional expected utilities is defined.
Of course, a similar problem comes up when asking an omniscient being in a case where our choices are indeterministic. We might think that we can make a better decision if that being tells us about the future. ("What will AAPL close at tomorrow?") But there is a bright line that can be drawn: we cannot use in our decision any information that depends on things that in turn depend on our decision, since that would create a vicious loop in the order of explanation. So an omniscient being metaphysically cannot give us information that essentially depends on our decisions. (In particular, if we're deciding whether to buy AAPL stock, he can't tell us what it will close at tomorrow, unless he has a commitment to make it close at that price no matter what we do; for without such a commitment, what it closes at tomorrow depends, in a complex and stochastic and perhaps chaotic way, on whether we buy the stock today.)
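A toy rendering of the vicious loop, under the same illustrative assumptions as before: if the oracle's answer depends on our decision and our decision is supposed to use the answer, evaluation never bottoms out.

```python
# Toy rendering of the vicious loop (illustrative only). The oracle's answer
# depends on the closing price, the closing price depends on our decision,
# and our decision is supposed to use the oracle's answer, so the order of
# explanation never bottoms out.

def closing_price(decision):
    # Tomorrow's close depends (among much else) on whether we buy today.
    return 150.0 if decision == "buy" else 149.0

def oracle():
    # The oracle can only report a close by settling what we will decide...
    return closing_price(decide())

def decide():
    # ...but our decision was supposed to be based on the oracle's report.
    return "buy" if oracle() > 149.5 else "hold"

decide()  # RecursionError: the loop has no starting point
```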
Let me end with this curious question:
- If you have a character that determines you never to ask for help, isn't that a reason to get professional help?