Handbook of Unconventional Computing: Theory / Implementations (2-Volume Set)

Did you know that computation can be implemented with cytoskeleton networks, chemical reactions, liquid marbles, plants, polymers and dozens of other living and inanimate substrates? Do you know what reversible computing or DNA microscopy is? Are you aware that randomness aids computation? Would you like to make logical circuits from enzymatic reactions? Have you ever tried to implement digital logic with Minecraft? Do you know that eroding sandstones can compute too? This volume reviews most of the key attempts at coming up with alternative ways of computing. In doing so, the authors show that we do not need computers to compute and we do not need computation to infer. It invites readers to rethink the computer and computing, and appeals to computer scientists, mathematicians, physicists and philosophers. The topics are presented in a lively and easily accessible manner and make for ideal supplementary reading across a broad range of subjects.

Author(s): Andrew Adamatzky
Series: WSPC Book Series in Unconventional Computing
Publisher: World Scientific Publishing
Year: 2021

Language: English
Pages: 1206
City: Singapore

Volume 1 : Theory
Contents
Preface
Chapter 1. Mapping the Territory of Computation Including Embodied Computation
1.1. Unconventional Computation
1.1.1. Implications
1.2. Computation in General
1.2.1. Topology of information representation
1.2.2. Topology of information processing
1.2.3. Programmability
1.2.4. Universality
1.3. Embodied Computation
1.3.1. Definition
1.3.2. Physics for computational purposes
1.3.2.1. Transduction
1.3.2.2. Analog computation
1.3.2.3. Quantum computation
1.3.2.4. Field computation
1.3.3. Computation for physical purposes
1.4. Programmable Matter
1.5. Artificial Morphogenesis
1.6. Conclusions
References
Chapter 2. Reversible Logic Element with Memory as an Alternative Logical Device
2.1. Introduction
2.2. Reversible Logic Element with Memory (RLEM)
2.2.1. Definition of a reversible logic element with memory
2.2.2. Rotary element (RE), a typical RLEM
2.2.3. Constructing reversible machines by REs
2.3. All Non-degenerate 2-State RLEMs Except Four are Universal
2.4. Realizing RLEMs in Reversible Environments
2.4.1. RLEMs in the billiard ball model
2.4.2. RLEM in a simple reversible cellular automaton
2.5. Concluding Remarks
References
Chapter 3. Space Bounded Scatter Machines
3.1. Introduction
3.2. First Concepts and Promised Results
3.3. The Experiment, the Protocols and the Machine
3.4. The Standard Scatter Machine
3.4.1. Probabilistic trees
3.4.2. Sparse oracles
3.4.3. Coding and decoding the vertex position
3.4.4. Lower bounds
3.4.5. Upper bounds
3.5. The Generalized Scatter Machine
3.5.1. Lower bounds
3.5.2. Upper bounds
3.6. Conclusion
Acknowledgments
References
Chapter 4. Exclusively Quantum: Computations Only Possible on a Quantum Computer
4.1. Introduction
4.2. Background: Parallelism and Quantum Theory
4.2.1. On the importance of parallelism
4.2.2. Quantum computation and quantum information
4.2.2.1. The qubit
4.2.2.2. Measurements
4.2.2.3. Putting qubits together
4.2.2.4. Entanglement
4.3. Some History
4.3.1. Quantum versus classical computation
4.3.2. A review of previous results
4.3.2.1. True randomness
4.3.2.2. Entangled states
4.3.2.3. Quantum speed-up
4.3.2.4. Quantum simulations
4.3.2.5. QTM versus DTM and PTM
4.3.2.6. Quantum versus classical complexity
4.3.2.7. Super-Turing computations
4.4. Unconventional Computations and Non-universality
4.5. Evolving Computations
4.5.1. Evolving computational complexity
4.5.1.1. Rank-varying computational complexity
4.5.1.2. Time-varying computational complexity
4.5.2. Evolving computational variables
4.5.2.1. Time-varying variables
4.5.2.2. Interacting variables
4.5.2.3. Computations obeying a global constraint
4.6. Quantum Fourier Transform
4.6.1. Rank-varying complexity
4.6.2. Semi-classical solution
4.6.3. Parallel approach
4.6.4. Quantum decoherence
4.6.5. Time-varying variables
4.7. Quantum Error-correction
4.7.1. Quantum codes
4.7.2. Time-varying complexity
4.7.3. Error correction via symmetrization
4.8. Entanglement Revisited
4.8.1. Interacting variables
4.8.2. Quantum distinguishability
4.8.2.1. Generalization
4.8.3. Another approach to distinguishability
4.8.4. Some consequences
4.8.4.1. Conveying quantum information through a classical channel
4.8.4.2. Protecting quantum information from classical attacks
4.8.5. Transformations obeying a global constraint
4.9. Conclusion
References
Chapter 5. Estimations of Integrated Information Based on Algorithmic Complexity and Dynamic Querying
5.1. Introduction
5.2. Basic Concepts of Integrated Information Theory
5.2.1. Calculus of ϕ
5.3. Methods
5.3.1. Programmability test and meta-test
5.3.2. Causal perturbation analysis
5.3.3. Causal influence and sublog program-size divergence
5.3.4. A simplicity versus complexity test
5.4. Numerical Results
5.4.1. Compression sensitivity as informative of integration
5.4.2. Finding simple rules in complex behavior
5.4.3. Simple rules and the black pattern of distribution of information
5.4.3.1. Automatic meta-perturbation test
5.4.3.2. Shrinking after dividing to rule
5.4.4. Limitations
5.5. Conclusions and Future Directions
References
Appendix A
A.1. How a meta-perturbation test works
Appendix B
Chapter 6. Robot Narratives
6.1. Introduction
6.2. Robots Explore an Alien Planet
6.2.1. Prologue
6.2.2. Greek Olympus, narrative robots
6.2.3. Roman Olympus, declarative robots
6.2.4. Epilogue
6.3. Robot Imaginations
6.3.1. Dennett’s Tower
6.3.2. Robots in the tower
6.3.3. An architecture for an ethical Popperian robot
6.3.4. From Popperian to Gregorian robots
6.3.5. Narrative logic
6.4. Gregorian Chat
6.4.1. The narrative hypothesis in more detail
6.4.2. The Gregorian chat system
6.4.2.1. Robot architecture
6.4.2.2. Hybrid physical/virtual environment architecture
6.4.3. Testing the narrative hypothesis in a robot ecology
6.4.4. Extensions of the approach
6.5. Discussion and Conclusions
6.5.1. Communication and cognition
6.5.2. Social robots
6.5.3. Narrative logic and its interface with world modeling in artificial intelligence
6.5.4. Beyond a “repository of actions”: The particular and the general in narrative
6.5.5. Story generator and story parser
6.5.6. Preparing for the future
Acknowledgments
References
Chapter 7. Evolving Boolean Regulatory Networks with Variable Gene Expression Times
7.1. Introduction
7.2. The RBNK Model
7.3. The RBNK Model with Variable Gene Expression Times
7.3.1. Gene expression
7.3.2. Experimentation
7.3.3. Asynchronous experimentation
7.4. Variable Sized GRN with Variable Gene Expression Times
7.4.1. Emergent complexity
7.4.2. Experimentation
7.5. Conclusions
References
Chapter 8. Membrane Computing Concepts, Theoretical Developments and Applications
8.1. Introduction
8.2. Types of P Systems
8.2.1. Preliminaries
8.2.2. Transition P systems
8.2.3. P systems with active membranes
8.2.4. Communication P systems
8.2.5. Tissue-like P systems with symport/antiport rules
8.2.6. Spiking neural P systems
8.2.7. Enzymatic numerical P systems
8.3. Computing Power and Computational Efficiency
8.3.1. Introduction
8.3.2. Computing power
8.3.2.1. Rewriting membrane systems
8.3.3. Computational efficiency
8.3.3.1. Recognizer membrane systems
8.3.3.2. Polynomial time complexity classes
8.3.3.3. Limits on efficient computations
8.3.3.4. Solving computationally hard problems
8.4. Applications of Membrane Computing
8.4.1. Modeling ecosystems with population dynamics P systems
8.4.2. Path planning and control of mobile robots
8.4.3. Fault diagnosis with spiking neural P systems
8.4.4. Other applications
8.4.5. Concluding remarks
8.5. Implementation of P Systems
8.5.1. Software implementations
8.5.2. GPU-based hardware implementations
8.5.2.1. GPU computing
8.5.2.2. GPU simulators of P systems
8.5.3. FPGA-based hardware implementations
8.5.4. Discussion
8.6. Concluding Remarks
References
Chapter 9. Computing with Modest Resources: How to Have Your Cake and Eat it Too
9.1. Introduction
9.2. Why Bigger is Not Necessarily Smarter
9.3. Why Smaller Might be Smarter
9.4. What is External Drive and How Does It Help
9.5. A Thought Experiment: Maxwell’s Daemon Rebooted
9.6. Example: Memristor Networks
9.6.1. Memristor model
9.7. Conclusions
References
Chapter 10. Physical Randomness Can Help in Computations
10.1. Introduction
10.2. Randomness is Important
10.3. How Can We Describe Randomness in Precise Terms?
10.4. Back to Physics
10.5. How Does This Affect Computations
Acknowledgments
References
Chapter 11. Probabilistic Logic Gate in Asynchronous Game of Life with Critical Property
11.1. Introduction
11.2. Asynchronous Game of Life and its Logic Gates
11.2.1. Phase transition and criticality
11.2.2. Computation by asynchronous GL
11.2.3. Logic gate in asynchronous GL
11.3. Discussion
11.4. Conclusion
Acknowledgements
References
Chapter 12. A Mystery of Human Biological Development — Can It Be Used to Speed up Computations?
12.1. Formulation of the Problem
12.2. Exponential Speedup Phenomenon: A Brief History and a Brief Description
Acknowledgments
References
Chapter 13. Symmetric Automata and Computations
13.1. Introduction
13.2. Structure of Abstract Automata and Instruction Machines
13.3. Structure of Symmetric Automata and Machines
13.4. Functional Characteristics of Operationally Symmetric Turing Machines
13.5. Conclusion
References
Chapter 14. Computation By Biological Means
14.1. Introduction
14.1.1. Computation
14.1.2. Use of biology for computation
14.1.3. What can biocomputers do?
14.2. DNA Computing
14.2.1. Origins of DNA computing
14.2.2. Computing with DNA
14.2.3. Adleman’s experiment
14.2.4. Sticker systems
14.2.5. Recent progress on computational power of DNA
14.2.6. Limitations of DNA
14.3. Computation with Slime Mould
14.3.1. Introduction to slime moulds
14.3.2. Slime moulds for computational tasks
14.3.3. Amoeboid organisms as a computer
14.3.4. Slime moulds as a form of computation
14.3.5. Select applications of slime moulds
14.3.6. Challenges of computing with slime moulds
14.4. Computation with Motile Biological Agents
14.4.1. Molecular motors
14.4.2. Use of biological motion in confined structures
14.4.3. Issues with mobile biologicals
14.5. Computation with Synthetic Biology
14.5.1. Chemotaxis
14.5.2. Saccharum saccharomyces and genetic toggles
14.5.3. Cellular computation
14.5.4. Multicellular machines
14.6. Differences in Biological Computing Paradigms
14.7. Discussion
14.7.1. Critical research gaps in biological computing
14.7.2. The case for biosupremacy
References
Chapter 15. Swarm and Stochastic Computing for Global Optimization
15.1. Introduction
15.2. Optimization
15.3. Stochastic Enhancements
15.4. Evolutionary Computation
15.5. Nature-Inspired Computing
15.5.1. Non-SI-based approaches
15.5.2. SI-based approaches
15.6. Discussions
References
Chapter 16. Vector Computation
16.1. Epistemology versus Ontology in the Quantum Computation Context
16.2. Types of Quantum Oracles for Randomness: Pure States in a Superposition Versus Mixed States
16.3. Questionable Parallelism by Views on a Vector
16.4. Computation by Projective Measurements of Partitioned State Space
16.5. Entanglement as Relational Parallelism Across Multi-Partite States
16.6. On Partial Views of Vectors
16.7. Summary
Acknowledgments
References
Chapter 17. Unsupervised Learning Approach Using Reinforcement Techniques on Bio-inspired Topologies
17.1. Introduction
17.1.1. Molecular networks
17.1.2. Cellular automata
17.1.3. Conway’s Game-of-Life
17.1.4. Neuromorphic computing systems
17.2. Molecular-based Topology
17.3. Artificial Neural Networks
17.4. Neuron Model
17.4.1. Simple Izhikevich model
17.5. Excitation Reinforcement
17.5.1. Majority-rule
17.5.2. Game-of-Life rule
17.6. Unsupervised Learning
17.6.1. Majority-rule learning
17.6.2. Game-of-Life learning
17.6.3. Hebbian learning
17.7. Training and Classification
17.8. Results
17.9. Discussion
Acknowledgement
References
Chapter 18. Intelligent Gamesourcing — Artificial Intelligence in Problem Solution by Game Playing
18.1. Introduction
18.1.1. Gamesourcing
18.1.2. History
18.2. Motivation
18.2.1. Leisure
18.2.2. Game playing
18.2.3. Paradigm
18.2.4. Structure
18.2.5. Fun
18.2.6. Credibility of outputs
18.2.7. Evaluation of success
18.2.8. Current status
18.2.8.1. Astro Drone
18.2.8.2. EteRNA
18.2.8.3. Foldit
18.2.8.4. Play to cure: Genes in space
18.2.8.5. Sea Hero Quest
18.3. Application — People versus Differential Evolution in Search of the Shortest Path
Acknowledgment
References
Index
Volume 2 : Implementations
Contents
Preface
Chapter 1. From Oscillatory Reactions to Robotics: A Serendipitous Journey Through Chemistry, Physics and Computation
1.1. Introduction
1.2. Systems Dynamics as a Computational Platform
1.2.1. Wet oscillating systems
1.2.2. Electrochemical oscillators
1.3. Computation and Control in Dynamic Systems
1.3.1. Computation in memristive devices and systems
1.3.1.1. Logic design with memristors/memristive devices
1.3.1.2. Matrix vector multiplication
1.3.1.3. Hardware artificial neural networks
1.3.2. Principles of control in dynamic systems — PID case
1.3.3. Reservoir computing
1.3.4. Reservoir computing and control systems
1.4. Controllers Beyond PID: Fuzzy and Neuromorphic
1.4.1. Fuzzy logic
1.4.2. Processing fuzzy logic by using molecules
1.4.3. Implementation of fuzzy logic systems in solid-state devices
1.4.4. Neuromorphic devices
1.5. Alternative Computing in Autonomous Robotics
1.5.1. Amoeba-based solution search system and electronic amoeba
1.5.2. Amoeba-inspired autonomously walking robot
1.5.3. Physicality for the identification of ground condition
1.5.4. Integration of reinforcement learning for efficient walking
1.6. Alternative Computing and Soft Robotics
1.7. Concluding Remarks
Acknowledgments
References
Chapter 2. Computing by Chemistry: The Native Chemical Automata
2.1. Introduction
2.2. A Brief History
2.3. How Native Chemical Automata are Built in Practice
2.3.1. Selecting the language-automata pair, and the chemistry
2.3.2. Initial conditions and alphabet symbol assignment
2.3.3. Recipe quantification and selection of time interval
2.3.4. Accept/reject criteria optimization
2.4. Reconfiguration and Variants/Extension of Native Chemical Automata
2.4.1. Inclusive hierarchy and automata reconfigurability
2.4.2. Extension to continuous operation (CSTR reactor)
2.4.3. Coupling of Belousov–Zhabotinsky to self-assembly
2.5. Conclusions and Outlook
References
Chapter 3. Discovering Boolean Functions on Actin Networks
3.1. Introduction
3.2. The Actin Network
3.3. Spike-based Gates
3.3.1. Automaton model
3.3.2. Interfacing with the network
3.3.3. Maximizing a number of logical gates
3.3.4. Actin droplet machine
3.4. Voltage-based Gates
3.4.1. The model
3.4.2. Extension to bundle networks
3.4.3. The network
3.4.4. Preliminary results
3.4.5. Results
3.4.5.1. Ideal electrodes
3.4.5.2. Boolean gates
3.4.5.3. Realistic electrodes
3.4.6. Finite state machine
3.4.6.1. Using two values of k = 4 and k = 6
3.5. Discussion
Acknowledgments
References
Chapter 4. Implication and Not-Implication Boolean Logic Gates Mimicked with Enzyme Reactions — General Approach and Application to Signal-Triggered Biomolecule Release Processes
4.1. Introduction
4.2. Mimicking IMPLY Logic Gate
4.3. Mimicking INHIB Logic Gate
4.4. Using the IMPLY and INHIB Logic Gates for Stimulating Molecule Release Function
4.5. Conclusions
Appendix
Acknowledgments
References
Chapter 5. Molecular Computation via Polymerase Strand Displacement Reactions
5.1. Introduction
5.1.1. Logic circuits
5.1.2. Chemical reaction networks
5.1.3. Chapter organization
5.2. Strand Displacement
5.3. Using Strand Displacing Polymerase for Computation
5.4. CRNs Using Strand Displacing Polymerase
5.5. Methods and Protocol
5.5.1. Oligonucleotide design, synthesis, and purification
5.5.2. Fluorescence sample preparation and measurement
5.6. Discussion and Outlook
References
Chapter 6. Optics-Free Imaging with DNA Microscopy: An Overview
6.1. Introduction
6.2. DNA Microscopy for Surface 2D Imaging
6.3. DNA Microscopy for Volumetric 3D Imaging
6.4. Conclusions and Outlook
Acknowledgments
References
Chapter 7. Fully Analog Memristive Circuits for Optimization Tasks: A Comparison
7.1. Introduction
7.2. Dynamical Equation for Memristor Circuits
7.2.1. Single memristor and Lyapunov function
7.2.2. Circuits
7.2.3. Lyapunov function for memristor circuits
7.2.4. Number of fixed points and stability
7.3. Analysis and Comparisons
7.3.1. The instances
7.3.2. Minimization of the continuous Lyapunov function
7.4. Conclusions
Acknowledgments
References
Chapter 8. Organic Memristive Devices for Bio-inspired Applications
8.1. Introduction
8.2. Organic Memristive Device
8.3. Adaptive Circuits
8.4. Relationship of Optical and Electrical Properties
8.5. Neuromorphic Applications
8.5.1. Frequency dependent plasticity
8.5.2. Nervous system mimicking circuits
8.5.3. Towards synapse prosthesis
8.5.4. Stochastic self-organized computational systems
8.6. Conclusions
References
Chapter 9. On Wave-Based Majority Gates with Cellular Automata
9.1. Introduction
9.2. Propagation Patterns in Life-like Rules
9.3. MAJORITY Gates by Competing Patterns
9.4. Final Notes
References
Chapter 10. Information Processing in Plants: Hormones as Integrators of External Cues into Plant Development
10.1. Introduction
10.2. Hormonal Encoding of Environmental Information
10.3. Self-regulatory Hormonal Network Underlying Plasticity in Plant Development
10.3.1. Information processing in the transition from dormancy to germination
10.4. Concluding Remarks
References
Chapter 11. Hybrid Computer Approach to Train a Machine Learning System
11.1. Introduction
11.1.1. A brief introduction to artificial intelligence and machine learning
11.1.2. Analog versus digital computing
11.1.3. Balancing an inverse pendulum using reinforcement learning
11.2. The Analog Simulation
11.3. The Reinforcement Learning System
11.3.1. Value function
11.3.2. Q-learning algorithm
11.3.3. Python implementation
11.3.3.1. States
11.3.3.2. Actions
11.3.3.3. Modeling the action value function Q(s, a)
11.3.3.4. Feature transformation to avoid underfitting
11.3.3.5. Decaying α and ε to improve learning and to avoid overfitting
11.3.4. Hybrid interface
11.4. Results
Acknowledgement
References
Chapter 12. On the Optimum Geometry and Training Strategy for Chemical Classifiers that Recognize the Shape of a Sphere
12.1. Introduction
12.2. The Evolutionary Optimization of Chemical Classifiers
12.3. Results
12.4. Conclusions
Acknowledgments
References
Chapter 13. Sensing and Computing with Liquid Marbles
13.1. Introduction
13.2. Collision-based Gate
13.3. Belousov–Zhabotinsky Cargo
13.4. Photosensor
13.5. Thermal Sensor
13.6. Robot Controller
13.7. Neuromorphic Marbles
13.8. Discussion
Acknowledgments
References
Chapter 14. Towards Colloidal Droplet Processors
14.1. Background
14.1.1. Data throughput
14.1.2. Heat dissipation
14.1.3. Energy consumption
14.2. Liquid Droplet Computer
14.2.1. Holonomic processors
14.3. Colloidal Processors
14.3.1. Phase change architectures
14.3.2. Microfluidic circuits
14.3.3. Layered shells
14.3.4. Granular media, foams, and plasmas
14.4. Biological Processors
14.5. Conclusions
Acknowledgment
References
Chapter 15. Biomolecular Motor-based Computing
15.1. Introduction
15.2. Applications of Biomolecular Motors in Computing
15.2.1. Parallel computing using biomolecular motors in nanofabricated networks
15.2.2. Computing using swarm robots prepared from biomolecular motors
15.2.2.1. Design and construction of molecular robots
15.2.2.2. Swarming of molecular robots
15.2.2.3. Controlling the shape morphology of swarms of molecular robots
15.2.2.4. Remote control of the swarming of molecular robots
15.2.2.5. Logical operation of molecular robots
15.2.2.6. Orthogonal swarming of molecular robots
15.3. Conclusions and Future Perspectives
References
Chapter 16. Computing with Square Electromagnetic Pulses
16.1. Introduction
16.2. Background
16.2.1. Brief historical retrospective
16.2.2. Basics of transmission line theory
16.3. Method
16.3.1. Main computing elements: Cross-points
16.3.2. Four-point crossing: Catt’s junction
16.3.3. Scattering matrix approach
16.3.4. Multi-way junctions: Series and parallel
16.3.5. Simulation results
16.3.6. Scaling down geometries
16.3.7. Pulse generation and control requirements
16.4. Future Directions
16.4.1. 3D-based structures
16.4.2. Graph-based computing
16.4.3. More complex signals and materials
16.5. Conclusion
Acknowledgements
References
Chapter 17. Creative Quantum Computing: Inverse FFT Sound Synthesis, Adaptive Sequencing and Musical Composition
17.1. Introduction
17.2. Why Quantum Computer Music?
17.3. Algorithmic Music
17.4. qSyn: Inverse FFT Sound Synthesis
17.5. qSeq: Adaptive Musical Sequencer
17.6. The Composition Zeno
17.7. Concluding Remarks
Appendix
Acknowledgments
References
Chapter 18. Logical Gates in Natural Erosion of Sandstone
18.1. Introduction
18.2. Modeling of Natural Erosion
18.3. Specific Parameters
18.4. Gates
18.4.1. AND gate
18.4.2. XOR gate
18.4.3. One-bit half-adder
18.5. Discussion
Supplementary Materials
References
Chapter 19. A Case of Toy Computing: Implementing Digital Logics with “Minecraft”
19.1. Introduction
19.2. History and Theory of Toy Computing
19.3. Digital Logics in “Minecraft”
19.3.1. Signal generation and transportation
19.3.2. NAND gates
19.4. Digital Circuits in “Minecraft” and the Remains of Electronics
19.4.1. The circuit
19.4.2. The 4-bit full adder built with TTL-logic
19.4.3. The 4-bit full adder built in “Minecraft”
19.5. Project Discussion
19.5.1. Building a BCD decoder in “Minecraft”
19.5.2. The computational labyrinth
19.5.3. Walking the way of the signal
19.6. Summary
Chapter 20. Evolving Conductive Polymer Neural Networks on Wetware
20.1. Introduction
20.2. Introduction of Polymer Neural Networks
20.3. Electropolymerisation of Conducting Polymer Wire
20.3.1. Polymer wire growth
20.3.2. Wire diameter and growth rate
20.3.3. Conductance increase of wires
20.3.4. Directional growth of wire and 3D growth
20.4. Machine Learning for Polymer Wire Growth
20.4.1. Supervised learning: Simple perceptron for simple logic gates
20.4.2. Unsupervised learning: Autoencoder for feature extraction
20.5. Conclusions
References
Index