Theory of Self-Reproducing Automata (Fourth Lecture: The Role of High and of Extremely High Complication)

Reference: von Neumann, J. (1966). Theory of Self-Reproducing Automata (edited and completed by Arthur W. Burks). Urbana: University of Illinois Press. Source file: VonNeumann.pdf.

Summary

This excerpt is the Fourth Lecture of von Neumann’s posthumously edited Theory of Self-Reproducing Automata. Von Neumann compares natural automata (nervous systems) with artificial computing machines across size, speed, energy dissipated per elementary act of information, and error behavior. He observes that although vacuum tubes are vastly larger and less energy-efficient than neurons, both dissipate far more per elementary act than the thermodynamic minimum, suggesting that physics alone does not explain the size gap and that reliability requirements likely do.
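
For orientation, a minimal back-of-the-envelope sketch of that comparison, assuming the kT ln 2 figure usually cited as the thermodynamic minimum per elementary binary act; the per-device dissipation numbers below are illustrative placeholders, not values quoted in the lecture:

    import math

    K_BOLTZMANN = 1.380649e-23  # Boltzmann constant, J/K
    T_ROOM = 300.0              # assumed room temperature, K

    # Thermodynamic minimum energy per elementary binary act: k * T * ln(2).
    thermo_min_joules = K_BOLTZMANN * T_ROOM * math.log(2)  # ~2.9e-21 J

    # Hypothetical per-act dissipation figures, for comparison only
    # (not taken from the lecture).
    example_devices = {"vacuum tube": 1e-8, "neuron": 1e-12}  # joules per act

    print(f"thermodynamic minimum: {thermo_min_joules:.2e} J")
    for name, joules_per_act in example_devices.items():
        ratio = joules_per_act / thermo_min_joules
        print(f"{name}: roughly {ratio:.0e} times the minimum")

Whatever the exact device figures, the point of the comparison is that both technologies sit many orders of magnitude above the floor, so the floor itself cannot be what separates them.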

The core argument concerns complication: below a threshold, a system cannot perform certain tasks at all; above it, qualitatively new behaviors (including self-reproduction and evolution) become possible. Natural automata tolerate errors locally rather than halting on any single fault, an architectural stance he contrasts with the “single-error” fragility of the computing machines of his day. The discussion foreshadows modern views on redundancy, fault tolerance, and emergent capabilities with scale.
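
The tolerate-locally versus halt-on-one-fault contrast is, in modern terms, a redundancy argument. A minimal illustrative sketch in Python, using majority voting over unreliable components (a modern analogue for exposition, not von Neumann’s own construction; the gate, error rate, and trial count are assumptions):

    import random

    def unreliable_gate(x: bool, error_rate: float = 0.05) -> bool:
        """Identity gate that flips its output with probability error_rate."""
        return (not x) if random.random() < error_rate else x

    def majority_vote(x: bool, copies: int = 3, error_rate: float = 0.05) -> bool:
        """Run several unreliable copies of the gate and take the majority answer."""
        votes = sum(unreliable_gate(x, error_rate) for _ in range(copies))
        return 2 * votes > copies

    # A single faulty copy no longer decides the outcome: the redundant organ
    # keeps functioning unless a majority of its parts fail at once.
    trials = 10_000
    failures = sum(majority_vote(True) is not True for _ in range(trials))
    print(f"residual error rate with 3-way voting: {failures / trials:.4f}")

With a 5% per-component error rate, three-way voting drops the residual error to under 1%, which is the shape of the trade-off the lecture gestures at: buy reliability with redundancy rather than with perfect components.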

Key Ideas

  • Complication threshold enables qualitatively new behavior.
  • Natural automata survive local errors; artificial automata halt on a single fault.
  • Analog-digital mixture characterizes biological computation.
  • Size and reliability trade-offs shape architecture.
  • Precursor to self-reproducing and evolvable systems theory.

Connections

Conceptual Contribution

Tags

#automata #complexity #fault-tolerance #foundational

Backlinks