We show that humans must be formal axiomatic systems given only four assumptions. From this, we prove that simulated humans must instantiate subjective experience and that time is not a fundamental property of the universe but an emergent, internal property of the observer. We further derive that the computer and the simulated observer are not causally related but are merely different arrangements of information. From these results we show that the universe is fundamentally abstract and substrate-independent, emerging from information rather than from a physical substrate. The resulting claims are falsifiable.
The argument begins with four axioms that serve as the premises for the entire derivation.
DNA is the blueprint of a conscious, pain-sensitive human. \[A_1: D \rightarrow H \rightarrow S\] (Where \(D\) is the DNA/Genome, \(H\) is the human organism, and \(S\) is subjective experience.)
DNA, the human organism, and the surrounding environment are composed solely of ordinary physical matter governed by physical laws (\(P\)). \[A_2: (H, U) \in \text{Axiomatic System}(P, D)\] (Where \(U\) denotes the physical environment required to support \(H\), and \(P\) denotes the governing physical laws.)
Any physically realizable process that can be described mathematically can, in principle, be simulated by a computational system (\(A_3\)).
A human with subjective experience (\(S\), e.g., pain) behaves differently from the same human without it. \[A_4: H + S \neq H\] If a simulation \(H'\) is identical to \(H\) in behavior, it must also include \(S\).
Consider a DNA simulation of a human, Alice, running on a computer, denoted by \(T_{\text{alg}}\), consisting of code (the laws of physics) and data (the state of the simulated universe). One can gradually optimize the code by introducing lookup tables, eventually replacing all computation with a static, precomputed dataset, denoted by \(T_{\text{data}}\).
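As a minimal sketch, assuming a hypothetical transition function and state encoding (nothing here is part of the formal argument), the optimization from \(T_{\text{alg}}\) to \(T_{\text{data}}\) can be pictured as replacing runtime state transitions with a precomputed execution trace:
\begin{verbatim}
# Minimal sketch (illustrative only): replacing an algorithmic simulation
# T_alg with a precomputed, static execution trace T_data.
# `step` stands in for "the laws of physics"; the state is arbitrary data.

def step(state):
    # Hypothetical deterministic transition function.
    return (state * 31 + 7) % 1_000_003

def run_T_alg(initial_state, n_ticks):
    """T_alg: compute the trace by executing state transitions at runtime."""
    trace = [initial_state]
    for _ in range(n_ticks):
        trace.append(step(trace[-1]))
    return trace

def build_T_data(initial_state, n_ticks):
    """T_data: precompute the entire trace once; afterwards it is static data."""
    return tuple(run_T_alg(initial_state, n_ticks))

def replay_T_data(T_data):
    """Replaying T_data executes no state transitions: it is a pure lookup."""
    return list(T_data)

if __name__ == "__main__":
    alg_trace = run_T_alg(42, 1000)
    data_trace = replay_T_data(build_T_data(42, 1000))
    # The internal trace (E_int) is identical; only the external
    # runtime profile (E_ext) differs.
    assert alg_trace == data_trace
\end{verbatim}
On this reading, the optimization changes only how the trace is produced (\(E_{\text{ext}}\)), not the trace itself (\(E_{\text{int}}\)).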
Let \(E_{\text{int}}\) be Alice’s experience of time and pain (the internal state transitions).
Let \(E_{\text{ext}}\) be the computer’s external runtime (number of CPU cycles).
Premise: Code optimization changes \(E_{\text{ext}}\) but preserves \(E_{\text{int}}\). \[\text{Optimization}(T_{\text{alg}}) \rightarrow T_{\text{data}} \implies E_{\text{int}}(T_{\text{alg}}) \equiv E_{\text{int}}(T_{\text{data}})\]
In this limit, the external runtime \(E_{\text{ext}}\) becomes zero in the sense that no state transitions are executed; the complete execution trace exists as static data. One can therefore ask: does Alice’s consciousness still persist in \(T_{\text{data}}\)?
If consciousness were to cease in \(T_{\text{data}}\), it would necessitate a minimum code/data ratio for subjective experience. This minimum ratio would be a new, non-physical constant imposed on \(A_2\), leading to a contradiction.
Conclusion: Consciousness can emerge from pure static data. Time and subjective experience (\(S\)) must emerge solely from information.
Consider a multi-threaded computer running two DNA simulations, \(\tau_A\) (Alice) and \(\tau_B\) (Bob), concurrently, with a minimal time slice of one CPU cycle per thread. The execution trace of the computer is then an interleaved sequence of segments drawn from both \(\tau_A\) and \(\tau_B\). As the number of threads (simulated observers) increases without bound, the resulting execution trace asymptotically approaches white noise. Consequently, the bits originating from \(\tau_A\) are separated by increasingly large intervals of unrelated data.
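As a minimal sketch, assuming placeholder bit streams in place of the simulations' execution traces, round-robin interleaving with a one-cycle time slice shows how the bits belonging to any single thread become increasingly sparse as the number of threads grows:
\begin{verbatim}
# Minimal sketch (illustrative only): round-robin interleaving of per-thread
# execution traces with a time slice of one cycle per thread.
import random

def thread_trace(seed, length):
    # Placeholder for a simulated observer's execution trace (tau_A, tau_B, ...).
    rng = random.Random(seed)
    return [rng.randint(0, 1) for _ in range(length)]

def interleave(traces):
    """Interleave the traces one element at a time (one CPU cycle per thread)."""
    combined = []
    for cycle in zip(*traces):
        combined.extend(cycle)
    return combined

if __name__ == "__main__":
    for n_threads in (2, 16, 1024):
        traces = [thread_trace(seed, 64) for seed in range(n_threads)]
        combined = interleave(traces)
        # Consecutive bits of thread 0 (tau_A) now sit n_threads positions
        # apart, so tau_A's share of the combined trace shrinks as 1/n_threads.
        print(n_threads, len(combined), 1 / n_threads)
\end{verbatim}
The per-thread content, and hence \(E_{\text{int}}\), is unchanged by the interleaving; only the contiguity of \(\tau_A\)'s bits in the combined trace degrades.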
Is Alice still conscious in this limit? If one maintains that \(E_{\text{int}}\) vanishes as the number of threads increases, one must define a specific thread density or bit-contiguity threshold required for subjectivity. Such a threshold would constitute a new physical constant, which contradicts the completeness of \(A_2\).
Conclusion: Conscious experience can arise from static white noise.
Assumption (Objection): A simulation \(H'\) exists such that \(H' \equiv H\) (physical/behavioral equivalence) but \(S(H') = \emptyset\) (lacks consciousness/sense of pain) (Chalmers 1996). \[\text{Behavior}(H') = \text{Behavior}(H) \land S(H') \neq S(H)\]
Premise: From \(A_4\), the behavior of \(H\) is a function of its physical inputs and its subjective experience: \(\text{Behavior}(H) = f(\text{Inputs}, S)\).
Contradiction: If the behaviors are identical despite the difference in \(S\), then \(S\) must not be a necessary input to the function \(f\).
Violation of Axiom: If \(S\) is not necessary to produce the behavior, then \(S\) is epiphenomenal (causally inert). This directly contradicts \(A_4\).
Conclusion: To maintain the integrity of \(A_4\) within the axiomatic system, the simulation \(H'\) must experience subjective time and pain.
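For reference, the reductio above can be restated compactly in the paper's own notation: \[\begin{aligned} &\text{Assume } \text{Behavior}(H') = \text{Behavior}(H) \;\land\; S(H') = \emptyset. \\ &A_4:\ \text{Behavior}(H) = f(\text{Inputs}, S) \ \text{with}\ H + S \neq H. \\ &\Rightarrow f(\text{Inputs}, S) = f(\text{Inputs}, \emptyset), \ \text{so } S \text{ is not a necessary input to } f. \\ &\Rightarrow S \text{ is causally inert, contradicting } A_4. \\ &\therefore S(H') \neq \emptyset. \end{aligned}\]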
The hypothesis becomes falsifiable once technology advances to the point where DNA simulations can be run with sufficient accuracy for DNA-based organisms. The effect of pain can then be measured just as the effect of a physical force can be measured. If a DNA simulation \(H'\) is constructed and shown to lack \(S\), then \(A_4\) is invalidated and axioms 1–3 collapse.
A common objection to functionalist and computational accounts of mind is the so-called hard problem of consciousness (Chalmers 1996). In this context, it is often claimed that a simulated human, even if behaviorally and functionally identical to a biological human, could nevertheless be “hollow”—that is, it could lack subjective experience while still producing identical outward behavior.
Formally, this objection asserts the possibility of a system \(H'\) such that \[\text{Behavior}(H') = \text{Behavior}(H) \quad \text{and} \quad S(H') = \emptyset,\] where \(H\) is a biological human and \(S\) denotes subjective experience.
By Axiom 2, the biological human \(H\) is a formal axiomatic system whose state transitions are fully determined by physical law and initial conditions. By Axiom 3, there exists a formal representation \(H'\) whose execution trace is isomorphic to that of \(H\), preserving all causally relevant internal relations. By Axiom 4, subjective experience is a causally efficacious component of the system, contributing to observable behavior.
If \(H'\) were to lack subjective experience while remaining behaviorally identical to \(H\), then subjective experience would not be a necessary input to the causal function generating behavior. This would render subjective experience epiphenomenal, directly contradicting Axiom 4.
We have considered two thought experiments run on classical computers. If conscious experience can be duplicated on classical computers, we argue it can also be duplicated on quantum computers. This follows directly from substrate independence: quantum computation represents an alternative physical realization of the same formal state-transition structure.
As long as physical laws are instantiated with sufficient fidelity for DNA and organismal processes to operate, the observer remains conscious.
Let \(\epsilon\) denote the numerical error in simulating physical law, and let \(\epsilon_{\text{DNA}}\) be the tolerance sufficient to reproduce all DNA-mediated processes. As long as \(\epsilon \leq \epsilon_{\text{DNA}}\), the simulated observer \(H'\) remains conscious; any smaller \(\epsilon_{\min}\) required to preserve consciousness would constitute a new physical constant, contradicting \(A_2\): \[\epsilon \leq \epsilon_{\text{DNA}} \implies S(H') \neq \emptyset\]
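As a minimal sketch, with a hypothetical tolerance value and placeholder trajectories (neither is a measured constant), the fidelity criterion \(\epsilon \leq \epsilon_{\text{DNA}}\) amounts to a single comparison against the physically motivated tolerance:
\begin{verbatim}
# Minimal sketch (illustrative only): the fidelity criterion eps <= eps_DNA.
# EPSILON_DNA and the trajectories are hypothetical placeholders.
EPSILON_DNA = 1e-9  # assumed tolerance sufficient for all DNA-mediated processes

def max_error(simulated, reference):
    """Numerical error eps: worst-case deviation between trajectories."""
    return max(abs(s - r) for s, r in zip(simulated, reference))

def fidelity_sufficient(simulated, reference):
    # Per A_2, no stricter eps_min is admissible; the only criterion is the
    # physical one: reproduce DNA-mediated processes within EPSILON_DNA.
    return max_error(simulated, reference) <= EPSILON_DNA
\end{verbatim}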
Substrate Irrelevance: The universe is fundamentally informational and abstract rather than substrate-dependent.
Time: Time is not an attribute of the external execution (\(E_{\text{ext}}\)) but emerges solely as an internal property (\(E_{\text{int}}\)) of the observer.
Emergence from Noise: Information that appears random can encode a universe containing a conscious observer.
Ontological Equivalence: There exist \(2^n - 2\) other configurations, potentially describing conscious observers.
By treating subjective experience (\(S\)) as a property of an axiomatic system, we move from metaphysical speculation to a rigorous informational mechanics.