Friday, October 6, 2023

Complexity and skeptical hypotheses

Suppose we have a strong epistemic preference for simpler theories of the world. One might then think that a simulation hypothesis is automatically more complex than the best physical story about our world, because in addition to all the complexity of our simulated cosmos, it includes the complexity of whatever physical cosmos houses the hardware running the simulation.

But this need not be the case. The best physical story about our world makes our world include vast amounts of information that would not need to be included in the simulation. To simulate the history of the human race, we need at most information on the particles within a sphere of radius about a hundred thousand light-years, so basically just the Milky Way Galaxy, a very small fraction of the particles in the world. And even that is a vast overstatement. One can surely use a low simulation resolution for a lot of stuff, simulating things only on a macroscopic level and including particle-level information only when the simulated humans peer into scientific instruments. So the information content of the simulation software could be much, much lower than the information content of the physical world that our best theories say we live in.
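
For a rough sense of scale, here is a back-of-envelope comparison (just an illustrative sketch: it takes the standard figure of about 46.5 billion light-years for the comoving radius of the observable universe and uses volume as a crude proxy for particle count):

```python
from math import pi

def sphere_volume(radius_ly: float) -> float:
    """Volume of a sphere in cubic light-years."""
    return (4.0 / 3.0) * pi * radius_ly ** 3

simulated_radius_ly = 1e5        # sphere enclosing roughly the Milky Way
observable_radius_ly = 4.65e10   # approximate comoving radius of the observable universe

ratio = sphere_volume(simulated_radius_ly) / sphere_volume(observable_radius_ly)
print(f"Simulation sphere is about {ratio:.0e} of the observable universe by volume.")
# Prints roughly 1e-17, i.e. an utterly negligible fraction.
```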

But what about the simulation hardware itself? Wouldn’t that need to live in a complex physical universe? Maybe, but that universe need not be as complex as our physical theories claim ours to be. It could be a universe whose level of physical complexity is optimized for running the computing hardware, and whose granularity is much coarser than ours. For instance, instead of that universe being made of tiny subatomic particles like ours, which require many particles per logic gate (though fewer and fewer as miniaturization progresses), we could suppose a universe optimized for computing whose fundamental building blocks are logic gates, memory cells, etc.

Thus I am dubious whether we can rule out simulation hypotheses by an epistemic preference for simpler theories. The same goes for Berkeleian skeptical hypotheses on which there is no physical world, but we are disembodied minds being presented with qualia.

And of course the “local five minute hypothesis”, on which the universe is five minutes old and has a radius of five light-minutes, posits a world with intuitively much less complexity than the world of our best theories, a world with vastly fewer particles.

But if we cannot avoid skeptical hypotheses on grounds of complexity, how can we avoid them?

My current view is that we simply have to suppose that our priors are normatively constrained by human nature (which on my Aristotelian view is a form, a real entity), and human nature requires us to have non-skeptical priors. This is a very anthropocentric account.

5 comments:

  1. Here's an idea:
    S1) The more fundamental a thing is, the more we should value its simplicity.

    I'm not sure how to spell out (S1) exactly, but let's apply the gist of (S1) to these two hypotheses:

    H1) We have a necessary Being that creates an old universe.
    H2) We have a (deceiving) necessary Being that creates a five-minute universe.

    Sure, the universe in (H2) is simpler than the universe in (H1). But, plausibly, the necessary Being in (H1) is simpler than the necessary Being in (H2), because the latter has more complex intentions. And necessary Beings are more fundamental than universes, so we should prefer (H1) to (H2).

  2. This comment has been removed by the author.

  3. I should add that I should have just said "God" instead of "Necessary Being" because I then immediately ascribe intentions to Him.

  4. This post seems to me to be using a rather one-dimensional concept of simplicity vs complexity. Should the complexity of a universe be judged by the number of particles and the information it would take to describe all their positions etc.? Given that Occam's Razor is about minimising causal factors more than it is about the number of objects, might it not be better to look at the complexity of the "mechanism" that produces those particle distributions? Even a simple set of mathematical laws can produce great complexity, albeit often repetitive, recursive (e.g., the Mandelbrot set), or stochastic. To give a silly example, a regular cubic lattice with a googolplex of particles would still be simpler than our observable universe, despite its much greater number of particles. Even if one massively increased the information required to describe it by adding small thermal random vibrations to the whole structure, it would still be simpler in the relevant sense.

    The minimal diameter universe posited (based on light-cones, I assume) as "simpler" than the one we generally infer, but still able to fully account for human experience and history, is not simpler at all, I would argue. It lacks coherence, since it has to account for the millions of other galaxies we appear to observe, uncannily like our own (e.g., Andromeda), by assuming that light rays create those images without being sourced in real entities. These rays are effectively elaborate illusions with unknown cause/s. Such an explanation savours more of the epicycles of modified Ptolemaic theory proposed to explain the retrograde motion of the planets, but far worse.

  5. "These rays are effectively elaborate illusions with unknown cause/s."

    Yes, there is a weird coincidence here: Why are all these rays coordinated like they are?

    But that coincidence is nothing compared to the vast cosmic coincidence of the low-entropy initial state of the universe on a standard Big Bang theory.
