Are we inhabitants of a computational construct rather than a mind-independent, “base” reality? The simulation question forces a confrontation with first principles: What counts as evidence? What is a physical law? What is a mind? For two decades, the debate has crystallized around Nick Bostrom’s philosophical Simulation Argument and, more recently, around Melvin Vopson’s attempts to recast physical regularities as consequences of information dynamics. Taken together, these projects invite a neutral but persistent scrutiny: if the world were a program, what—if anything—should look different? And if nothing would, is the thesis explanatory, scientific, or metaphysical?
Framing the Hypothesis: Philosophical vs. Physical Claims
The simulation hypothesis is often presented in two registers. The first is philosophical and concerns probability and reference classes: given assumptions about future civilizations and computational power, how likely is it that beings with experiences like ours are simulated? The second is physical and concerns the structure of natural laws: if information is fundamental, could forces, symmetries, or thermodynamic trends emerge from computation-like optimization?
Both registers sharpen the problem but also expose it to different critiques. Philosophically, the key vulnerabilities are the assumptions smuggled into the probability calculus and the choice of observer class. Physically, the central concerns are testability, underdetermination, and the danger of re-describing familiar physics in computational metaphors without gaining predictive power.
Bostrom’s Simulation Argument: A Trilemma, Not a Verdict
Bostrom’s contribution is frequently misread as an assertion that we are simulated. In fact, it is a trilemma: either (1) almost no civilizations reach “posthuman” status, or (2) almost no posthuman civilizations run significant numbers of ancestor simulations, or (3) almost certainly we are in a simulation. The power of the argument is to make complacent realism epistemically uncomfortable: once you grant substrate-independent consciousness and feasible, large-scale emulations, the “reference class” of observers like us becomes dominated by simulated observers.
The pressure points are well known but deserve emphasis:
- Reference class problem. The probative force of the argument depends on which observers count as “like us.” If the class is defined by phenomenology (having experiences like ours), simulants may dominate. If defined by causal origin (biologically evolved primates), non-simulants dominate. There is no non-question-begging way to choose the class without further theory.
- Agnostic premises. The two premises that do real work—substrate-independent minds and feasible emulation—are contestable. Emulation might demand not only astronomical computation, but also high-fidelity modeling of decohering quantum systems and embodied ecological couplings, pushing feasibility beyond hand-wavy estimates.
- Decision-theoretic awkwardness. If the trilemma’s third horn were true, how should we act? Bostrom’s pragmatic view—“carry on”—is sensible, but it highlights an asymmetry: a thesis that cannot rationally guide action or discriminate predictions risks becoming an elegant curiosity.
Read charitably, the argument’s achievement is to expand the space of serious possibilities without claiming evidential closure. It functions best as a skeptical pressure test on our background assumptions about technology, consciousness, and typicality.
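The probabilistic core of the trilemma can be made concrete. Bostrom’s 2003 paper derives the fraction of observers with human-type experiences who are simulated as f_sim = f_P·N̄ / (f_P·N̄ + 1), where f_P is the fraction of civilizations reaching a posthuman stage and N̄ is the average number of ancestor simulations each runs. A minimal sketch of that arithmetic, with purely illustrative inputs (these numbers are not estimates):

```python
def simulated_fraction(f_p: float, n_sims: float) -> float:
    """Bostrom (2003): fraction of observers with human-type experiences
    who are simulated, given f_p (fraction of civilizations reaching a
    posthuman stage) and n_sims (average ancestor simulations each runs)."""
    return (f_p * n_sims) / (f_p * n_sims + 1)

# Illustrative inputs only: even a tiny f_p is swamped once n_sims is huge,
# which is why horns (1) and (2) must do the work if horn (3) is rejected.
print(simulated_fraction(0.001, 1_000_000))  # ≈ 0.999
print(simulated_fraction(0.001, 0))          # 0.0 (horn 2: no simulations run)
```

The formula makes the reference-class worry vivid: the conclusion follows only if the denominator counts simulated and non-simulated observers in the same class.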
Vopson’s Infodynamics: From Metaphor to Mechanism
Where the trilemma works in abstract space, Vopson aims at mechanism. He proposes that information dynamics obey a “second law” distinct from thermodynamic entropy: in closed informational systems, information entropy tends to decrease or remain constant, driving compression and optimization. He then sketches how such a principle might illuminate patterns across domains—genetic evolution, mathematical symmetry, and even gravity—by treating the world as an information-processing system seeking representational economy.
This is a bold shift from metaphor (“the universe is like a computer”) to operational hypothesis (“physical regularities arise from compression pressure”). Several claims stand out:
- Compression as a unifying tendency. If systems evolve toward minimal descriptive complexity, we should observe convergences on symmetry, regularity, and efficient codes. That would make “lawfulness” not a brute fact, but an emergent byproduct of informational housekeeping.
- Discrete “cells” of space-time. By modeling reality as a lattice of information-bearing units, one can derive dynamics where bringing matter together reduces the number of required state descriptors—yielding attractive behavior we label gravity.
- Mass–energy–information linkage. If information is physical, it may carry energetic or mass-like attributes, potentially reframing puzzles such as dark matter in informational terms and motivating laboratory tests involving information “deletion.”
The attraction of this program is clear: it promises testable bridges between information theory and fundamental physics. Yet here the standards are necessarily high. Re-describing known regularities in the language of compression is not enough; what matters is novel, discriminating prediction. Does infodynamics forecast a quantitative anomaly that standard models do not? Can it retrodict established constants without free parameters? Can its “lattice” commitments be falsified by precision measurements that would look different if reality were continuous?
What Would Count as Evidence?
A mature evaluation requires clarifying what would make the simulation hypothesis—or its infodynamic avatar—evidentially vulnerable. Several routes are often discussed:
- Lattice artifacts. If space-time were discretized on a computational grid, extraordinarily high-energy processes (e.g., cosmic rays) might reveal subtle anisotropies or dispersion relations aligned with the grid’s axes. Absence of such signatures places lower bounds on the energy scale of any putative lattice (equivalently, upper bounds on its spacing).
- Complexity ceilings. A finite simulator might impose resource-driven limits—on quantum entanglement depth, for instance, or on the complexity of interference patterns. Experiments could hunt for unexpected saturation points not predicted by standard theory.
- Thermodynamic asymmetries. If an informational second law diverges from thermodynamic entropy, carefully constructed “closed” information systems might exhibit directionality (toward compression) that resists reduction to conventional statistical mechanics.
- Energetic cost of information erasure. Landauer’s principle already ties information erasure to heat dissipation. Stronger, non-redundant links—e.g., mass deficits tied to information deletion—would be decisive if observed cleanly, disentangled from ordinary dissipation.
Each avenue faces familiar obstacles: measurement precision, background effects, and, crucially, underdetermination. A signal compatible with simulation may also be compatible with non-simulation theories (quantum-gravity proposals, emergent spacetime models, or novel condensed-matter analogies). The danger is confirmation drift: seeing computation-friendly patterns where multiple frameworks predict similar phenomena.
Methodological Cautions: When Analogies Overperform
Three methodological cautions temper exuberant conclusions:
- The highest-tech metaphor problem. Cultures analogize the cosmos to their best machines—clocks, engines, now computers. Such metaphors can be heuristically fruitful but risk category mistakes if promoted to ontology without adjudicating power against rivals.
- Explanatory bookkeeping. Recasting “gravity” as “information compression” must not merely rename the explanandum. Mechanistic depth requires showing how the new description reduces free parameters, unifies disparate phenomena, or resolves anomalies without ad hoc scaffolding.
- Bayesian accounting. Priors matter. If one assigns low prior probability to substrate-independent consciousness or to feasible ancestor-scale emulations, the posterior that “we are simulated” remains low even under Bostrom-style likelihoods. Conversely, very broad priors can wash out evidential discipline.
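The Bayesian point can be sketched with a two-hypothesis update. The numbers below are hypothetical, chosen only to show how the prior dominates when the likelihood ratio is modest:

```python
def posterior_simulated(prior_sim: float, lik_sim: float, lik_base: float) -> float:
    """Posterior P(simulated | evidence) via Bayes' rule over two hypotheses:
    'simulated' (prior prior_sim) and 'base reality' (prior 1 - prior_sim),
    with the evidence's likelihood under each."""
    num = prior_sim * lik_sim
    return num / (num + (1 - prior_sim) * lik_base)

# Evidence 10x more likely under simulation barely moves a skeptical prior:
print(posterior_simulated(prior_sim=0.001, lik_sim=0.5, lik_base=0.05))  # ≈ 0.0099
# The same likelihood ratio against a permissive prior:
print(posterior_simulated(prior_sim=0.5, lik_sim=0.5, lik_base=0.05))    # ≈ 0.909
```

The asymmetry between the two outputs is the whole point: without independent grounds for the prior, the posterior mostly restates one’s starting commitments.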
Ethical and Existential Spillovers (Whatever the Ontology)
One reason the simulation hypothesis captivates is that it reframes familiar ethical terrain:
- Design ethics. If future beings can instantiate conscious lives in software, then our present choices about AI, virtual agents, and mass emulations acquire moral weight. The simulation question thus boomerangs into policy: should we ever create worlds populated by minds capable of suffering?
- Meaning without metaphysical guarantees. Even if reality were computed, human projects—care, knowledge, art—do not evaporate. Value supervenes on experience and relationship, not on substrate. The practical stance is thus robust across ontologies.
- Epistemic humility. The hypothesis is a salutary reminder that our models may be local compressions of a deeper order. That humility fuels better science whether or not the universe runs on silicon-like primitives.
A Neutral Appraisal
Where does this leave a conscientious, academic observer?
- Bostrom’s trilemma remains a powerful challenge to naive realism, but its bite depends on contestable premises and on choices about observer classes that are philosophically underdetermined.
- Vopson’s program is promising as a research agenda precisely to the extent it yields crisp, risky predictions that standard physics does not. Its long-term value will be measured not by rhetorical resonance but by explanatory economy and empirical traction.
- The simulation hypothesis, as a scientific claim, earns credibility only when it pays rent in predictions. As a philosophical pressure test, it already pays rent by disciplining our assumptions about typicality, embodiment, and mind.
The intellectually honest posture is, therefore, neither credulity nor dismissal but continued critical curiosity. If future work derives quantitative signatures—lattice-direction anisotropies with specific scaling, information-linked mass-energy effects beyond Landauer limits, or complexity ceilings inexplicable within standard theory—then the balance of reasons will shift. Absent that, the simulation thesis remains a live metaphysical option and a fertile heuristic, not yet an empirically preferred hypothesis.
Conclusion: The Value of the Question
Asking whether we are a simulation is not merely a game of speculative ontology. It is a lever that pries open several joints of inquiry: how minds arise, why laws are simple, what information is. Bostrom teaches us to track our assumptions about the distribution of observers; Vopson challenges us to cash “information is physical” into mechanisms that risk being wrong. The safest prediction is that, independent of the hypothesis’s ultimate truth, the methods developed along the way—sharper reference-class reasoning, tighter links between information and dynamics, more discriminating experiments—will enrich our understanding of the world we inhabit, simulated or not.
Until a decisive test discriminates “base” from “emulated” reality, we should refuse both complacent certainty and performative skepticism. Instead, we can let the question do its best work: refine our standards of evidence, clarify our explanatory ambitions, and expand the frontier where physics, computation, and philosophy meet. If the curtain can be pulled back, it will be pulled back by those virtues—not by slogans, but by results.
Sources
- Bostrom, Nick. “Are You Living in a Computer Simulation?” The Philosophical Quarterly 53, no. 211 (2003): 243–255.
- Eggleston, Brian. “A Review of Bostrom’s Simulation Argument.” Stanford University (symbsys205 course material), summary of Bostrom’s probabilistic reasoning.
- Vopson, Melvin M. “The Second Law of Infodynamics and its Implications for the Simulation Hypothesis.” AIP Advances 13, no. 10 (2023): 105206.
- Vopson, Melvin M. “Gravity Emerging from Information Compression” (AIP Advances, 2025) and associated University of Portsmouth communications.
- Orf, Darren. “A Scientist Says He Has the Evidence That We Live in a Simulation.” Popular Mechanics, April 3, 2025.
- Tangermann, Victor. “Physicist Says He’s Identified a Clue That We’re Living in a Computer Simulation.” Futurism, May 3, 2023.
- IFLScience staff. “Physicist Studying SARS-CoV-2 Virus Believes He Has Found Hints We Are Living In A Simulation.” October 2023.
- Vopson, Melvin M. Reality Reloaded: How Information Physics Could Explain Our Universe. 2023.
- Classical background for philosophical skepticism: Plato’s “Allegory of the Cave”; René Descartes, Meditations on First Philosophy (for historical framing).
