Consider the following BASIC computer program:

    input a
    input b
    print a+b

I run the program, and it pauses for input. I press "5" and "enter". Then it pauses for input again. I press "7" and "enter". It then displays "12".
Intuitively, the computer has answered the question: "What is 5+7?" But that's projection. From the marks on the screen and one's memory of the input, one can deduce that 5+7=12. But the computer program can be interpreted in a variety of ways. We could, for instance, take the program to answer a different question, the question of what an inscription of the decimal number equal to 5+7 looks like. In this case, the program's answer shouldn't be interpreted as a number, but as a numeral. Or we could take the program to answer the question of what the result of converting 5 and 7 (or maybe the ASCII codes 53 and 55) to binary, then adding the results together, and then converting back to decimal would be. And so on. There is an infinity of questions we could take this program to be answering.
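The point can be made concrete with a sketch. Here, hedged as an illustration rather than anything the original program contains, are three Python functions answering three of the questions above: one returning a number, one returning a numeral (a string), and one going through an explicit binary detour. The screen displays the same marks "12" in every case, so the display alone cannot settle which question was answered:

```python
def add_as_numbers(a, b):
    # Interpretation 1: the program answers "what is a+b?"
    # The answer is a number.
    return a + b

def add_as_numerals(a, b):
    # Interpretation 2: the program answers "what does an inscription
    # of the decimal number equal to a+b look like?"
    # The answer is a numeral, i.e. a string of characters.
    return str(a + b)

def add_via_binary(a, b):
    # Interpretation 3: the program answers "what is the result of
    # converting a and b to binary, adding, and converting back
    # to decimal?"
    bin_a, bin_b = bin(a), bin(b)          # '0b101', '0b111'
    total = int(bin_a, 2) + int(bin_b, 2)  # addition on the binary forms
    return int(str(total))                 # back through decimal notation

# All three answers produce the same marks on the screen:
print(add_as_numbers(5, 7))   # 12
print(add_as_numerals(5, 7))  # 12
print(add_via_binary(5, 7))   # 12
```

Nothing in the output distinguishes the three functions; the difference lies entirely in which question we take the computation to be answering.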
All of these questions are different, and which one we take the computer to be answering is completely up to us. For, in fact, it is not the case that the computer is answering a question. Rather, we are using the computer to find an answer to some question or other, and there is an infinite number of questions we could be using the program to find an answer to. (We could be using the program to find an answer to the question whether the "enter" key works or not!)
It seems very implausible that adding complexity to this program could ever make it determinate which question the program is answering: any more elaborate program would be open to the same multiplicity of interpretations, with the choice among them still up to us. Therefore computers simply cannot answer determinate questions.