Computation · the universe as process
If information says distinctions are real, computation says transformations of distinctions are how reality runs.
Kernel
Computation is the layer at which information undergoes rule-governed change. Alan Turing's 1936 universal machine showed that any computable transformation of any sequence of distinctions can be performed by a single, simple device. The implication is enormous: there is a structural unity to all rule-following systems, from a thermostat to a brain to a galaxy. The radical version of the thesis — the universe is, at its base, a single ongoing computation — has had three principal advocates: Konrad Zuse (Calculating Space, 1969), Edward Fredkin (digital physics, 1990s), and Stephen Wolfram (A New Kind of Science, 2002; the Wolfram Physics Project, 2020). Whether it is correct or only useful is the central open empirical question of this layer.
Turing machines and Church's thesis
Turing's 1936 paper defines a machine with a finite set of internal states, a movable read/write head, and an unbounded tape, and proves that a single universal machine of this kind can simulate any other such machine, including itself. Alonzo Church independently arrived at an equivalent characterization of computability in the lambda calculus. The Church–Turing thesis says: "computable" means "computable by a Turing machine." Nearly ninety years of attempts to find a counterexample have produced none. Every quantum algorithm, every neural network, every cellular automaton can be simulated by a Turing machine, slower but exactly. This is one of the most stable empirical regularities in any science.
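The finite control plus unbounded tape can be sketched in a few lines. This is an illustrative simulator (the names `run` and `BB2` are mine, not standard); the machine it runs is the well-known 2-state busy beaver, which halts after 6 steps leaving four 1s on an initially blank tape.

```python
# A minimal Turing machine: finite control plus an unbounded tape,
# represented as a dict so cells exist only once visited.

def run(rules, state="A", halt="H"):
    tape, pos, steps = {}, 0, 0
    while state != halt:
        symbol = tape.get(pos, 0)                  # blank cells read as 0
        write, move, state = rules[(state, symbol)]
        tape[pos] = write
        pos += 1 if move == "R" else -1
        steps += 1
    return steps, sum(tape.values())

# The 2-state busy beaver: (state, read) -> (write, move, next_state).
BB2 = {
    ("A", 0): (1, "R", "B"),
    ("A", 1): (1, "L", "B"),
    ("B", 0): (1, "L", "A"),
    ("B", 1): (1, "R", "H"),
}

steps, ones = run(BB2)
print(steps, ones)  # 6 steps, 4 ones written
```

The dict-as-tape trick matters: the tape is unbounded in both directions, but only the finitely many visited cells are ever stored, which is exactly the sense in which Turing's "infinite tape" is physically realizable at any finite time.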
Cellular automata and Wolfram's claim
Stanislaw Ulam and John von Neumann invented cellular automata at Los Alamos in the 1940s as a substrate for self-replicating systems. John Conway's Game of Life (1970) showed that a handful of simple birth-and-survival rules can generate rich, self-sustaining structure; the system was later proved Turing-complete. Stephen Wolfram in A New Kind of Science (2002) classified one-dimensional cellular automata and made a stronger claim: that the universe at sufficient depth resembles such a rule-system, and that what we call physics is the macroscopic statistics of an enormous computational substrate running underneath. The 2020 Wolfram Physics Project proposes specific candidate rule-systems, based on hypergraph rewriting, and argues that features of general relativity and quantum mechanics emerge as statistical limits of such systems. The proposal is unproven but no longer obviously crank.
The brain as a biological computer
The McCulloch–Pitts paper of 1943 was the first formalization of the neuron as a logical-gate-like computational unit. Decades of subsequent neuroscience have not refuted the basic picture; they have refined it. The human brain contains approximately 86 billion neurons, each making thousands of synaptic connections, and performs an estimated 10^15 operations per second on about 20 watts. Whether this constitutes "a computer" in the literal Turing sense is partly a definitional question. What is not in dispute is that the same mathematical formalisms — linear algebra, dynamical systems, information theory, statistical inference — describe both neural computation and silicon computation, and that the success of deep neural networks in the 2020s is the strongest empirical evidence to date that the brain's computational principles are at least partly transferable.
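The McCulloch–Pitts abstraction is simple enough to state in full: a unit fires (outputs 1) exactly when the weighted sum of its inputs reaches a threshold. The sketch below (names like `mp_unit` are my own) picks weights by hand to realize the basic logic gates; since NAND alone suffices for any Boolean circuit, networks of such units inherit the full power of digital logic.

```python
# McCulloch–Pitts threshold unit: fire iff weighted input sum
# meets the threshold. Weights below are hand-chosen, not learned.

def mp_unit(inputs, weights, threshold):
    return int(sum(w * x for w, x in zip(weights, inputs)) >= threshold)

AND = lambda a, b: mp_unit([a, b], [1, 1], 2)
OR  = lambda a, b: mp_unit([a, b], [1, 1], 1)
NOT = lambda a:    mp_unit([a],    [-1],   0)

# NAND is universal: every Boolean function reduces to compositions of it.
NAND = lambda a, b: NOT(AND(a, b))

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", AND(a, b), OR(a, b), NAND(a, b))
```

The refinement of later decades is visible in what this model omits: real neurons have continuous dynamics, timing, and learned weights. But the 1943 core, thresholded weighted sums, is still the atom of every deep network trained today.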
AI and the computation explosion
By 2026, frontier AI systems have been trained using more than 10^25 floating-point operations — a number larger than the total computation performed by humanity before 1980. The compute used per frontier training run roughly doubles every six months. The civilizational consequences are large enough that the previous layers (technology, science, mathematics) are being restructured in real time. Whether what we are seeing is "computation reaching a phase transition" or "engineering scaling a known curve until it stops" is the most consequential open question of the late 2020s. The Wolfram and Hutter readings of computation predict that the curve does not stop until it hits the substrate of physics itself.
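A six-month doubling time is easy to underestimate; it compounds to 4x per year. The back-of-the-envelope below takes only the two figures stated above (roughly 10^25 FLOP in 2026, doubling every six months) and projects them forward; the function name and the assumption that the trend simply continues are mine, and the whole point of the open question is whether it does.

```python
# Growth implied by a six-month doubling time: 2^(12/6) = 4x per year.
# Starting figure (~1e25 FLOP in 2026) is taken from the text above.

def projected_flop(start_flop, years, doubling_months=6):
    doublings = years * 12 / doubling_months
    return start_flop * 2 ** doublings

for years in range(6):
    print(2026 + years, f"{projected_flop(1e25, years):.1e} FLOP")
```

Five years of this curve is a factor of about a thousand, which is why "does it stop, and where" dominates every other question on this layer.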
Open questions on this layer
- Is the universe a computation, or only describable by one?
- Are there computations that the universe cannot perform?
- Is AI a new computational species or a tool?