Does evolution have a built-in Occam’s razor?
Darwinian evolution proceeds by natural selection acting on random variation. In this talk I will argue that random mutations can generate strong biases in the arrival of novel phenotypic variation, and that these biases can dramatically shape the spectrum of adaptive outcomes. The basic intuition follows from an algorithmic twist on the infinite monkey theorem, inspired by the fact that natural selection doesn’t act directly on mutations, but rather on the phenotypes that are generated by developmental programmes. If the monkeys type at random in a computer language, they will preferentially produce outputs that can be generated by shorter algorithms. This intuition can be formalised with the coding theorem of algorithmic information theory, which predicts that random mutations are exponentially more likely to produce simpler phenotypes with low descriptional (Kolmogorov) complexity. Evidence for this evolutionary Occam’s razor will be presented for the prevalence of symmetry in protein complexes, and for simplicity in RNA secondary structures, gene regulatory networks, leaf shape, and Richard Dawkins’ biomorphs model of development. A similar principle may apply to learning and help explain why neural networks generalise well on typical datasets.
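For concreteness, the bound invoked above can be written in a standard form of Levin's coding theorem (stated here as background, not quoted from the talk): if a universal prefix machine U is fed uniformly random input bits, the probability that it halts with output x satisfies

\[
  P(x) \;=\; 2^{-K(x) + O(1)},
\]

where K(x) is the prefix Kolmogorov complexity of x. Assuming the genotype-to-phenotype map behaves sufficiently like such a machine, phenotypes with short descriptions are therefore exponentially more likely to arise under random mutation than complex ones.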