Algorithmic Information Theory

Reference: Peter D. Grünwald and Paul M.B. Vitányi (2008). Algorithmic Information Theory. Book chapter in the Handbook of the Philosophy of Information. arXiv:0809.2754v2. Source file: 0809.2754v2 (1).pdf. URL: https://arxiv.org/abs/0809.2754

Summary

A tutorial survey of algorithmic information theory (Kolmogorov complexity) and its relationship to Shannon information theory. The authors explain how the amount of information in an individual object can be measured by the length of the shortest program that produces it, and how this quantity — invariant up to an additive constant — supports a non-probabilistic theory of information.

The chapter covers the invariance theorem, uncomputability, relation to entropy, mutual information, the Kolmogorov structure function (separating meaningful/structural from random information), Minimum Description Length, normalized compression distance, and the philosophical implications for Occam’s razor and inductive inference.
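The normalized compression distance mentioned above becomes directly computable once K is approximated by a real compressor. A minimal sketch of that idea, assuming zlib as the stand-in compressor (the helper names are illustrative, not from the chapter):

```python
import os
import zlib

def c(data: bytes) -> int:
    """Approximate K(x) by the length of the zlib-compressed string (level 9)."""
    return len(zlib.compress(data, 9))

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance:
    NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y))."""
    cx, cy = c(x), c(y)
    return (c(x + y) - min(cx, cy)) / max(cx, cy)

if __name__ == "__main__":
    a = b"the quick brown fox jumps over the lazy dog " * 50
    b = b"the quick brown fox leaps over the lazy dog " * 50
    r = os.urandom(len(a))      # incompressible noise for contrast
    print(round(ncd(a, b), 3))  # small: the two texts share most structure
    print(round(ncd(a, r), 3))  # near 1: almost no shared structure
```

Real compressors only upper-bound K, so values can fall slightly outside [0, 1]; the theory in the chapter is stated for the ideal, uncomputable quantity.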

Key Ideas

  • Kolmogorov complexity K(x) as length of shortest program outputting x
  • Invariance theorem makes K objective up to an additive constant (formalized after this list)
  • Structure function separates meaningful (model) from random content
  • MDL as a practical, computable approximation of ideal algorithmic inference (toy sketch below)
  • Shannon entropy equals expected Kolmogorov complexity, up to an additive constant, for computable distributions
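The first, second, and last bullets correspond to standard statements that can be written out as follows (a standard rendering, not copied verbatim from the chapter; $\ell(p)$ is the length of program $p$, and the constants depend only on the machines or on $P$, never on $x$):

```latex
% Kolmogorov complexity with respect to a universal machine U
K_U(x) = \min \{\, \ell(p) : U(p) = x \,\}

% Invariance theorem: for every machine V there is a constant c_V,
% independent of x, such that
K_U(x) \le K_V(x) + c_V

% Entropy as expected complexity: for every computable distribution P
% there is a constant c_P such that
0 \le \sum_x P(x)\, K(x) - H(P) \le c_P
```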
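The MDL bullet can be made concrete with a crude two-part code for binary strings: pick the hypothesis minimizing bits for the model plus bits for the data given the model. This toy example is my own illustration, not from the chapter; the (1/2) log2 n term is the standard cost of encoding one real-valued parameter:

```python
import math

def h2(p: float) -> float:
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def mdl_codelengths(bits: str) -> dict:
    """Two-part code lengths (in bits) for two candidate models."""
    n = len(bits)
    p_hat = bits.count("1") / n
    return {
        "uniform": float(n),                              # 0 parameters: 1 bit/symbol
        "bernoulli": n * h2(p_hat) + 0.5 * math.log2(n),  # data cost + parameter cost
    }

if __name__ == "__main__":
    for name, s in [("biased", "1" * 90 + "0" * 10), ("fair-ish", "10" * 50)]:
        lengths = mdl_codelengths(s)
        best = min(lengths, key=lengths.get)
        print(name, {k: round(v, 1) for k, v in lengths.items()}, "->", best)
```

For the biased string the Bernoulli model pays for its parameter and still wins (~50 bits vs. 100); for the balanced string the parameter cost is wasted and the uniform code wins, which is the MDL trade-off in miniature.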

Connections

Conceptual Contribution

Tags

#information-theory #kolmogorov-complexity #mdl #foundations

Backlinks