Algorithmic information theory (AIT), known to mathematicians as Kolmogorov complexity, can provide a useful tool for scientists studying natural systems. However, some critical conceptual issues need to be understood, and the advances already made need to be collated and presented in a form accessible to scientists. This book has been written in the hope that readers will absorb the key ideas behind AIT, leaving them better placed to follow the mathematical developments and to apply the ideas to their own areas of interest. The theoretical underpinning of AIT is outlined in the earlier chapters, while the later chapters focus on applications, drawing attention to the thermodynamic commonality between ordered physical systems, such as the alignment of magnetic spins and a laser maintained distant from equilibrium, and ordered living systems, such as a bacterial system, an ecology, and an economy.
Key Features
- Presents a mathematically complex subject in language accessible to scientists
- Provides rich insights into modelling far-from-equilibrium systems
- Emphasises applications across a range of fields, including physics, biology and econophysics
- Empowers scientists to apply these mathematical tools to their own research
Author(s): Sean D. Devine
Publisher: IOP Publishing
Year: 2020
Language: English
Pages: 238
City: Bristol
Preface
Reference
Author biography
Sean D Devine
Symbols
Chapter 1 Introduction
1.1 Brief outline of the book
1.2 What are complex systems?
1.3 Some approaches to complex or organised systems
1.4 Algorithmic information theory (AIT)
1.4.1 Algorithmic information theory and entropy
1.4.2 Problems with AIT
1.5 Algorithmic information theory and mathematics
1.6 Real-world systems
1.6.1 AIT concepts needed to explore the natural world
References
Chapter 2 Computation and algorithmic information theory
2.1 The computational requirements for workable algorithms
2.2 The Turing machine
2.2.1 Alternative Turing machines
2.2.2 The universal Turing machine
2.2.3 Non-computable functions
2.3 Measure theory
2.3.1 Cylinder sets
References
Chapter 3 AIT and algorithmic complexity
3.1 Shorter algorithms imply order
3.2 Machine dependence and the invariance theorem
3.2.1 Issues with AIT
3.3 Self-delimiting coding and the Kraft inequality
3.4 Optimum coding and Shannon’s noiseless coding theorem
3.4.1 Delimiting coding of a natural number
3.4.2 The relationship with entropy, conditional entropy, etc.
3.5 Entropy relative to the common framework
3.6 Entropy and probability
3.7 The fountain of all knowledge: Chaitin’s Omega
3.8 Gödel’s theorem and formal axiomatic systems
3.9 The algorithmic entropy, the universal semi-measure and inference
3.9.1 Inference and the universal distribution
References
Chapter 4 The algorithmic entropy of strings with structure and variation
4.1 Identical algorithmic approaches to strings with variation
4.2 The provisional entropy
4.3 The specification by a probability distribution
4.3.1 The algorithmic minimal sufficient statistics approach to the provisional entropy
4.3.2 Summing up
4.3.3 A simple example of the provisional entropy
4.3.4 Hidden structure and the term ‘provisional’
4.3.5 Algorithmic entropy not critically dependent on detail
4.4 How to specify noisy data
4.5 The non-typical state and the thermodynamic entropy
References
Chapter 5 Modelling and the minimum description length
5.1 Introduction
5.2 The algorithmic entropy approach to modelling
5.2.1 A more general example
5.2.2 The AMSS application to model selection
5.3 The minimum description length approach
5.3.1 Ideal MDL
5.3.2 Example of a non-typical string in a set
5.3.3 Practical MDL
5.3.4 Maximum entropy formulation
References
Chapter 6 The non-typical string and randomness
6.1 Outline of perspectives on randomness
6.2 Martin-Löf test of randomness
Martin-Löf test examples
6.2.1 Randomness deficiency as a Martin-Löf test
6.2.2 The Martin-Löf universal P-test of randomness
6.2.3 The sum P-test
6.2.4 The use of universal concepts
References
Chapter 7 Order and entropy
7.1 The meaning of order
7.2 Algorithmic entropy and the traditional entropy
7.2.1 Specification of the state space
7.2.2 Boltzmann, Gibbs, Shannon and algorithmic entropy
7.2.3 Zurek’s physical entropy, missing information and the algorithmic description
7.2.4 The consistency of the provisional entropy
References
Chapter 8 Reversibility and Landauer’s principle
8.1 Introduction
8.2 Landauer’s principle
8.3 Outline of Landauer’s argument
8.4 The simulation of a reversible real-world computation
8.4.1 Maintaining reversibility in a laboratory UTM
8.5 The algorithmic entropy as a function of state
8.5.1 When bits are tracked, bits are conserved
8.5.2 Must the algorithm that specifies the provisional entropy halt?
8.5.3 Tracking bit flows in non-halting computations
8.6 External interventions to restore a degraded system
References
Chapter 9 The algorithmic equivalent of the second law of thermodynamics
9.1 The meaning of ‘equilibrium’
9.1.1 The source of bits in different scenarios
9.2 The increase in thermodynamic entropy as a system trends to the most probable set of states
9.3 The relationship between algorithmic entropy and the thermodynamic entropy of a macrostate
9.3.1 Simple models illustrating isolated systems
9.3.2 General approach to specifying all states in an isolated system
References
Chapter 10 How replication processes maintain a system far from the most probable set of states
10.1 Maintaining a system distant from equilibrium
10.1.1 Brief summary of the earlier exposition of AIT
10.2 Examples of the computational issues around the degradation of simple systems
10.3 Entropy balances in a reversible system
10.4 Homeostasis and second law evolution
10.4.1 Information requirements of homeostasis
10.5 Replication processes generate order to counter the second law of thermodynamics
10.5.1 Replication as a process that creates ordered structures
10.6 Simple illustrative examples of replication
10.6.1 Replicating spins
10.6.2 Coherent photons as replicators
10.7 The algorithmic entropy cost of replica variations
10.7.1 Natural ordering through replication processes
10.8 A replicating living system
10.9 Selection processes to sustain a natural system
10.9.1 Adaptation and interdependence of replicas in an open system
10.10 Summary of system regulation and AIT
References
Chapter 11 Sustainability requirements of a viable economy distant from equilibrium
11.1 Introduction
11.2 A reminder of the principles of AIT
11.2.1 Natural laws as computations
11.2.2 An illustration based on an image on a screen
11.2.3 Replication processes create natural order
11.3 An economy seen as a replicating system
11.3.1 The contrast with a neoclassical economy
11.4 Order creation through the know-how of economic agents
11.5 A narrative to capture economic development
11.5.1 The hypothetical steps to greater economic sophistication
11.5.2 Independent agents
11.5.3 Agent adaptation
11.5.4 Trade increases order
11.5.5 Trade in artefacts
11.5.6 Order created by innovation, tools and machines
11.5.7 Amalgamation and nesting of agents to form new agent types
11.6 Are there resource limits to economic growth?
11.7 Order and GDP
11.8 Complementary approaches to human systems
11.8.1 Control information
11.8.2 Energy flows in low entropy systems
11.9 Implications of the algorithmic approach
11.10 Conclusions for economic systems
References
Chapter 12 AIT and philosophical issues
12.1 Algorithmic descriptions, learning and artificial intelligence
12.2 The mathematical implications of algorithmic information theory
12.3 How can we understand the Universe?
Why can we make any sense of the Universe?
12.4 Closing thoughts
References