This book introduces the concept of “bespoke learning”, a new mechanistic approach that makes it possible to generate values of an output variable at each designated value of an associated input variable. Here the output variable generally provides information about the system’s behaviour or structure, and the aim is to learn the input-output relationship, even though little to no information on the output is available, as is the case in many real-world problems. Once the output values have been bespoke-learnt, the originally-absent training set of input-output pairs becomes available, so that (supervised) learning of the sought inter-variable relation is then possible. Three ways of undertaking such bespoke learning are offered: by tapping into system dynamics in generic dynamical systems, to learn the function that drives the system’s evolution; by comparing realisations of a random graph variable, given multivariate time series datasets of disparate temporal coverage; and by designing maximally information-availing likelihoods in static systems. These methodologies are applied to four different real-world problems: forecasting daily COVID-19 infection numbers; learning the gravitational mass density in a real galaxy; learning a sub-surface material density function; and predicting the risk of onset of a disease following bone marrow transplants. Primarily aimed at graduate and postgraduate students studying a field that includes facets of statistical learning, the book will also benefit experts working in a wide range of applications. The prerequisites are undergraduate-level probability and stochastic processes, and preliminary ideas of Bayesian statistics.
Author: Dalia Chakrabarty
Publisher: Springer
Year: 2023
Language: English
Pages: 240
City: Cham
Foreword
Preface
Acknowledgements
Contents
1 Bespoke Learning to Generate Originally-Absent Training Data
1.1 Introduction
1.1.1 Some Definitions
1.2 Prediction Notwithstanding Unavailability of Training: Real-world Examples
1.2.1 Gravitational Mass Density in Galaxies
1.2.2 Composition of Rocks in Petrophysics
1.2.3 Temporally-Evolving Systems
1.3 Relevant Ambition
1.4 Bespoke Learning
1.4.1 Training Data Is Absent: Not That Training Data Points Are Missing
1.5 A Wide-Angled View: Bespoke Learning in the Brain?
1.6 Summary
References
2 Learning the Temporally-Evolving Evolution-Driving Function of a Dynamical System, to Forecast Future States: Forecasting New COVID-19 Infection Numbers
2.1 Introduction
2.1.1 Time Series Modelling
2.1.1.1 ARIMA, etc.
2.1.1.2 Empirical Dynamical Models
2.1.1.3 Our New Approach and EDM
2.1.2 Why Seek Potential?
2.1.3 Hidden Markov Models and Our Approach
2.1.4 Markov Decision Processes, Reinforcement Learning and Our Approach
2.1.5 A New Way to Forecast: Learn the Evolution-Driving Function
2.1.6 Evolution-Driver, a.k.a. Potential Function
2.2 Learning Scheme: Outline and 3 Underlying Steps
2.2.1 Can the Potential be Learnt Directly Using Observed Phase Space Variables?
2.3 Robustness of Prediction: Extra Information from the 2nd Law
2.4 3-Staged Algorithm
2.4.1 Outline of Bespoke Learning in Step I
2.4.2 Embedding Potential into Support of Phase Space pdf: Part of Step I
2.4.3 Learning Potential as Modelled with a Gaussian Process: Part of Step II
2.4.4 Rate and Location Variable Computation: Part of Step III
2.5 Collating All Notation
2.6 Details of the Potential Learning
2.6.1 Likelihood
2.6.2 Likelihood Given Vectorised Phase Space pdf
2.6.3 Prior, Posterior and MCMC-Based Inference
2.7 Predicting Potential at Test Time: Motivation
2.7.1 Learning the Generative Process Underlying Temporal Variation of Potential, Following Bespoke Potential Learning
2.7.1.1 Closed-Form Prediction at a New Time Window
2.7.1.2 Errors of Forecasting
2.7.1.3 Advantage of Our Learning Strategy: Reviewed
2.7.2 Improved Modelling of Σ_compo?
2.8 Illustration: Forecasting New Rate and Number of COVID-19 Infections
2.8.1 Data
2.8.2 What Is “Potential” in This Application?
2.9 Negative Forecast Potential, etc.
2.9.1 Implementing the 3-Step Learning+Forecasting in This Application
2.9.2 A Few Technical Issues to Note in This Empirical Illustration
2.9.3 Collating the Background
2.9.4 Bespoke Learning of Potential: Results from Steps I and III
2.9.5 Forecasting: Results from Steps II and III
2.9.6 Quality of Forecasting
2.9.6.1 How Far from the Mean?
2.9.6.2 Information Redundancy and Forecasting at the 8th Time Window
2.9.6.3 Permutation Entropy
2.9.6.4 Why Is the Forecast Bad Around the 2020–2021 Transition?
2.9.7 Work to Be Done
2.10 Summary
References
3 Potential to Density via Poisson Equation: Application to Bespoke Learning of Gravitational Mass Density in a Real Galaxy
3.1 Introduction
3.1.1 Motivating Bespoke Learning
3.1.2 A Particularly Difficult Data Set for Bespoke Learning
3.2 Methodology
3.2.1 Potential and Phase Space Probability Density Function
3.2.2 Phase Flow and Boltzmann Equation
3.2.3 So Far This Is What We Know
3.2.4 Why Consider System Potential to be Velocity Independent?
3.2.5 Relevant Abstraction
3.2.6 Centrality of the Potential
3.2.7 Half the Phase Space Coordinates Cannot be Observed
3.2.8 Why Include Only Energy to Recast Support of f_W(·)?
3.2.9 Probability of Data
3.2.10 Isotropy
3.2.11 Likelihood, Including Acknowledgement of Measurement Uncertainties
3.2.12 Ingredients of Inference, Following Likelihood Identification
3.2.13 In the Absence of Training Data
3.2.14 ρ and f Vectors
3.2.15 Computing Potential at a Given R, Given Vectorised Gravitational Mass Density
3.2.16 Model in Light of Vectorised Potential and Phase Space pdf
3.2.17 Convolving with Error Density
3.2.18 Priors
3.2.19 Inference on f and ρ
3.2.20 How Many Energy Partitions? How Many Radial Partitions?
3.2.20.1 Binning the Radial Range
3.2.20.2 Binning the Energy Range
3.2.21 Inference Using MCMC
3.2.22 Wrapping Up Methodology
3.3 Empirical Illustration on Real Galaxy NGC4649
3.3.1 Learning the ρ(R) Function and pdf f(·)
3.3.1.1 Predicting from the Learnt ρ(·)
3.3.1.2 Gravitational Mass Enclosed Within a Radius
3.3.1.3 Details of Learning and Prediction
3.3.1.4 Predicting Upon Learning the Phase Space pdf
3.4 Conclusion
3.4.1 Testing for Isotropy in the Data
3.4.2 Working with a Multivariate Phase Space pdf
3.4.3 Summary
References
4 Bespoke Learning in Static Systems: Application to Learning Sub-surface Material Density Function
4.1 Introduction
4.2 Bespoke Learning, Followed by 2-Staged Supervised Learning
4.2.1 Bayesian Implementation of Bespoke Learning
4.2.1.1 Learning Y at Given Values of X, Using Data on W
4.2.1.2 How Can W Inform on the Sought Y?
4.2.1.3 Motivating a Model for the Likelihood
4.2.1.4 Posterior and Inference from It
4.2.2 What If a Different Likelihood?
4.2.3 Dual-Staged Supervised Learning of g(X) (=Y)
4.3 Application to Materials Science
4.3.1 Generating the Originally-Absent Training Data
4.3.2 Underlying Stochastic Process
4.3.3 Details of Existing Bespoke Learning
4.4 Learning Correlation of Underlying GP and Predicting Sub-surface Material Density
4.4.1 Non-parametric Kernel Parametrisation
4.4.2 Predictions
4.4.3 Forecasting
4.5 Conclusions
References
5 Bespoke Learning of Disease Progression Using Inter-Network Distance: Application to Haematology-Oncology: Joint Work with Dr. Kangrui Wang, Dr. Akash Bhojgaria and Dr. Joydeep Chakrabartty
5.1 Introduction
5.2 Learning Graphical Models and Computing Inter-Graph Distance
5.2.1 Learning the Inter-Variable Correlation Structure of the Data
5.2.2 Learning the SRGG
5.2.3 Inference on the SRGG
5.2.4 Distance Between Graphical Models
5.3 Learning Relative Score Parameters Using Learnt Graphical Models
5.3.1 Details of Score Computation
5.4 Application to Haematology-Oncology
5.4.1 Learning the Relation Between VOD-Score and Pre-transplant Variables, and Prediction
5.4.2 Which Pre-transplant Factors Affect SOS/VOD Progression Most?
5.5 Summary
References
A Bayesian Inference by Posterior Sampling Using MCMC
A.1 Bayesian Inference by Sampling
A.1.1 How to Sample?
A.1.2 MCMC
A.1.3 Metropolis-Hastings
A.1.4 Gibbs Sampling
A.1.5 Metropolis-Within-Gibbs
Index