Mathematical Geosciences: Hybrid Symbolic-Numeric Methods

This second edition of Mathematical Geosciences adds five new topics:

1. Solution of equations with uncertainty, which proposes two novel methods for solving nonlinear geodetic equations as stochastic variables when the parameters of these equations have uncertainties characterized by probability distributions. The first method, an algebraic technique, partly employs symbolic computations and is applicable to polynomial systems whose parameters follow different uncertainty distributions. The second method, a numerical technique, uses stochastic differential equations in Ito form.

2. Nature-inspired global optimization, in which meta-heuristic algorithms are based on natural phenomena, such as Particle Swarm Optimization. This approach simulates, e.g., schools of fish or flocks of birds, and is extended through a discussion of geodetic applications. The Black Hole Algorithm, which is based on the black hole phenomenon, is added, and a new variant of its code is introduced and illustrated with examples (a minimal sketch of the basic algorithm is given below).

3. The application of the Gröbner basis to integer programming based on numeric-symbolic computation, introduced and illustrated by solving some standard problems.

4. An extension of the applications of integer programming to solving phase ambiguity in Global Navigation Satellite Systems (GNSS), treated as a global quadratic mixed-integer programming task that can be transformed into a pure integer problem to a given digit of accuracy. Three alternative algorithms are suggested, two of which are based on local and global linearization via McCormick envelopes.

5. Machine learning techniques (MLT), which offer effective tools for stochastic process modelling. The Stochastic Modelling part is extended with stochastic modelling via MLT, and their effectiveness is compared with that of modelling via stochastic differential equations (SDE). Mixing MLT with SDE, frequently known as neural differential equations, is also introduced and illustrated by an image classification and a regression problem.
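For background on topic 4: the standard McCormick envelope replaces a bilinear term w = x*y, with x in [xL, xU] and y in [yL, yU], by four linear inequalities,

    w >= xL*y + x*yL - xL*yL,    w >= xU*y + x*yU - xU*yU,
    w <= xU*y + x*yL - xU*yL,    w <= xL*y + x*yU - xL*yU,

which is the textbook relaxation that the local and global linearizations mentioned above build on.

For topic 2, the following is only a minimal sketch of the basic Black Hole algorithm in Python (the book's own code is written in Mathematica, and its new variant is not reproduced here): candidate solutions ("stars") are pulled toward the current best solution (the "black hole"), and stars that cross the event horizon are re-initialized at random. The test function sphere, the population size, and the iteration count are illustrative assumptions, not the book's settings.

import numpy as np

def black_hole_minimize(f, bounds, n_stars=30, n_iter=200, seed=None):
    """Minimal Black Hole optimization sketch (illustrative, not the book's variant)."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds[0], float), np.asarray(bounds[1], float)
    dim = lo.size
    stars = rng.uniform(lo, hi, size=(n_stars, dim))      # initial random population
    fitness = np.apply_along_axis(f, 1, stars)
    best = np.argmin(fitness)                              # best star becomes the black hole
    bh, bh_fit = stars[best].copy(), fitness[best]

    for _ in range(n_iter):
        # pull every star toward the black hole by a random fraction of the distance
        stars += rng.random((n_stars, 1)) * (bh - stars)
        np.clip(stars, lo, hi, out=stars)
        fitness = np.apply_along_axis(f, 1, stars)

        # a star that outperforms the black hole takes its place
        best = np.argmin(fitness)
        if fitness[best] < bh_fit:
            bh, bh_fit = stars[best].copy(), fitness[best]

        # stars inside the event horizon are absorbed and replaced by new random stars
        radius = bh_fit / (np.sum(fitness) + 1e-12)
        absorbed = np.linalg.norm(stars - bh, axis=1) < radius
        stars[absorbed] = rng.uniform(lo, hi, size=(int(absorbed.sum()), dim))

    return bh, bh_fit

# usage on a hypothetical 2-D sphere function
sphere = lambda x: float(np.sum(x**2))
x_best, f_best = black_hole_minimize(sphere, ([-5.0, -5.0], [5.0, 5.0]), seed=1)
print(x_best, f_best)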

Author(s): Joseph L. Awange, Béla Paláncz, Robert H. Lewis, Lajos Völgyesi
Edition: 2
Publisher: Springer
Year: 2023

Language: English
Pages: 732
City: Cham

Foreword to the First Edition
Preface to the Second Edition
Preface to the First Edition
Introduction
Numeric and Symbolic Methods—What Are They?
Numeric Solution
Symbolic Solution
Hybrid (Symbolic-Numeric) Solution
Contents
Part I Solution of Nonlinear Systems
1 Solution of Algebraic Polynomial Systems
1.1 Zeros of Polynomial Systems
1.2 Resultant Methods
1.2.1 Sylvester Resultant
1.2.2 Dixon Resultant
1.3 Gröbner Basis
1.3.1 Greatest Common Divisor of Polynomials
1.3.2 Reduced Gröbner Basis
1.3.3 Polynomials with Inexact Coefficients
1.4 Using Dixon-EDF for Symbolic Solution of Polynomial Systems
1.4.1 Explanation of Dixon-EDF
1.4.2 Distance from a Point to a Standard Ellipsoid
1.4.3 Distance from a Point to Any 3D Conic
1.4.4 Pose Estimation
1.4.5 How to Run Dixon-EDF
1.5 Applications
1.5.1 Common Points of Geometrical Objects
1.5.2 Nonlinear Heat Transfer
1.5.3 Helmert Transformation
1.6 Exercises
1.6.1 Solving a System with Different Techniques
1.6.2 Planar Ranging
1.6.3 3D Resection
1.6.4 Pose Estimation
References
2 Homotopy Solution of Nonlinear Systems
2.1 The Concept of Homotopy
2.2 Solving Nonlinear Equation via Homotopy
2.3 Tracing Homotopy Path as Initial Value Problem
2.4 Types of Linear Homotopy
2.4.1 General Linear Homotopy
2.4.2 Fixed-Point Homotopy
2.4.3 Newton Homotopy
2.4.4 Affine Homotopy
2.4.5 Mixed Homotopy
2.5 Regularization of the Homotopy Function
2.6 Start System in Case of Algebraic Polynomial Systems
2.7 Homotopy Methods in Mathematica
2.8 Parallel Computation
2.9 General Nonlinear System
2.10 Nonlinear Homotopy
2.10.1 Quadratic Bezier Homotopy Function
2.10.2 Implementation in Mathematica
2.10.3 Comparing Linear and Quadratic Homotopy
2.11 Applications
2.11.1 Nonlinear Heat Conduction
2.11.2 Local Coordinates via GNSS
2.12 Applications
2.12.1 GNSS Positioning N-Point Problem
References
3 Overdetermined and Underdetermined Systems
3.1 Concept of Over- and Underdetermined Systems
3.1.1 Overdetermined Systems
3.1.2 Underdetermined Systems
3.2 Gauss–Jacobi Combinatorial Solution
3.3 Gauss–Jacobi Solution in Case of Nonlinear Systems
3.4 Transforming Overdetermined System into a Determined System
3.5 Extended Newton–Raphson Method
3.6 Solution of Underdetermined Systems
3.6.1 Direct Minimization
3.6.2 Method of Lagrange Multipliers
3.6.3 Method of Penalty Function
3.6.4 Extended Newton–Raphson
3.7 Applications
3.7.1 Geodetic Application—The Minimum Distance Problem
3.7.2 Global Navigation Satellite System (GNSS) Application
3.7.3 Geometric Application
3.8 Exercises
3.8.1 Solution of Overdetermined System
3.8.2 Solution of Underdetermined System
4 Nonlinear Geodetic Equations with Uncertainties: Algebraic-Numeric Solutions
4.1 Introductory Remarks
4.2 Nonlinear System of Equations with Uncertainties
4.2.1 Problem Definition
4.2.1.1 Algebraic Solution
4.2.1.2 Employing Stochastic Homotopy
4.2.2 Systems of Equations
4.2.2.1 Algebraic Solution
4.2.2.2 Stochastic Homotopy
4.2.2.3 Solution of Equations Simultaneously
4.2.3 Special Cases
4.2.3.1 Different Types of Uncertainties
4.2.3.2 Transcendental Equations
4.2.3.3 Solution of Equations Simultaneously
4.3 Geodetic Examples
4.3.1 Planar Ranging
4.3.1.1 Algebraic Solution
4.3.1.2 Stochastic Homotopy
4.3.2 3D Resection
4.3.3 Ranging by Global Navigation Satellite Systems (GNSS)
4.3.3.1 Observation Equations
4.3.3.2 Algebraic Solution
4.3.3.3 Stochastic Homotopy
4.4 Overdetermined Systems
4.4.1 Problem Definitions
4.4.2 Transforming an Overdetermined System into a Determined One
4.4.2.1 Algebraic Solution
4.4.2.2 Stochastic Homotopy
4.4.3 Gauss-Jacobi Approach
4.4.3.1 Solution of the Subsystems
4.4.3.2 The Averaged Solution of the Subsystems
4.5 Positioning by 3D-Resection
4.5.1 Introduction
4.5.2 Algebraic Solution
4.5.3 Stochastic Homotopy Solution
4.6 Ranging by GNSS
4.6.1 Introductory Remarks
4.6.2 Solution for Implicit Error Definition
4.6.3 Solution for Explicit Error Definition
4.7 GPS Meteorology—Bending Angles
4.7.1 Introduction
4.7.2 Algebraic Solution
4.7.3 Solution via Stochastic Homotopy
References
Part II Optimization of Systems
5 Simulated Annealing
5.1 Metropolis Algorithm
5.2 Realization of the Metropolis Algorithm
5.2.1 Representation of a State
5.2.2 The Free Energy of a State
5.2.3 Perturbation of a State
5.2.4 Accepting a New State
5.2.5 Implementation of the Algorithm
5.3 Algorithm of the Simulated Annealing
5.4 Implementation of the Algorithm
5.5 Application to Computing Minimum of a Real Function
5.6 Generalization of the Algorithm
5.7 Applications
5.7.1 A Packing Problem
5.7.2 The Traveling Salesman Problem
5.8 Exercise
6 Genetic Algorithms
6.1 The Genetic Evolution Concept
6.2 Mutation of the Best Individual
6.3 Solving a Puzzle
6.4 Application to a Real Function
6.5 Employing Sexual Reproduction
6.5.1 Selection of Parents
6.5.2 Sexual Reproduction: Crossover and Mutation
6.6 The Basic Genetic Algorithm (BGA)
6.7 Applications
6.7.1 Nonlinear Parameter Estimation
6.7.2 Packing Spheres with Different Sizes
6.7.3 Finding All the Real Solutions of a Non-Algebraic System
6.8 Exercise
6.8.1 Foxhole Problem
References
7 Nature Inspired Global Optimization
7.1 Particle Swarm Optimization
7.1.1 Global Optimization
7.1.2 Particle Swarm Optimization
7.1.3 Optimization: Definitions and Basic Algorithm
7.1.4 Illustrative Example
7.1.5 Variants of PSO
7.1.6 PSO Applications in Geosciences
7.2 Black Hole Optimization
7.2.1 Black Hole Algorithm
7.2.2 Code for the 1D Algorithm
7.2.3 Code for the 2D Algorithm
7.3 Test Examples
7.3.1 Example 1
7.3.1.1 Simulated Annealing
7.3.1.2 Differential Evolution
7.3.1.3 Random Search
7.3.1.4 Black Hole
7.3.2 Example 2
7.3.2.1 Simulated Annealing
7.3.2.2 Differential Evolution
7.3.2.3 Random Search
7.3.2.4 Black Hole
7.3.3 Example 3
7.3.3.1 Simulated Annealing
7.3.3.2 Differential Evolution
7.3.3.3 Random Search
7.3.3.4 Black Hole
7.3.4 Example 4
7.3.4.1 Simulated Annealing
7.3.4.2 Differential Evolution
7.3.4.3 Random Search
7.3.4.4 Black Hole
References
8 Integer Programming
8.1 The Integer Problem
8.2 Discrete Value Problems
8.3 Simple Logical Conditions
8.4 Some Typical Problems of Binary Programming
8.4.1 Knapsack Problem
8.4.2 Nonlinear Knapsack Problem
8.4.3 Set-Covering Problem
8.5 Solution Methods
8.5.1 Binary Countdown Method
8.5.2 Branch and Bound Method
8.5.3 Application of Gröbner Basis
8.6 Mixed-Integer Programming
8.7 Applications
8.7.1 Integer Least Squares
8.7.2 Optimal Number of Oil Wells
8.7.3 Solution of GNSS Phase Ambiguity
8.7.3.1 Introduction
8.7.3.2 Three Methods to Solve Integer Programming
8.7.3.3 Mixed Integer Programming
8.7.3.4 Computing the Next Best Integer Solution
8.7.3.5 Solution of GNSS Phase Ambiguity Problem
8.7.3.6 Concluding Remarks
8.8 Exercises
8.8.1 Study of Mixed Integer Programming
8.8.2 Mixed Integer Least Square
References
9 Multiobjective Optimization
9.1 Concept of Multiobjective Problem
9.1.1 Problem Definition
9.1.2 Interpretation of the Solution
9.2 Pareto Optimum
9.2.1 Nonlinear Problems
9.2.2 Pareto-Front and Pareto-Set
9.3 Computation of Pareto Optimum
9.3.1 Pareto Filter
9.3.2 Reducing the Problem to the Case of a Single Objective
9.3.3 Weighted Objective Functions
9.3.4 Ideal Point in the Function Space
9.3.5 Pareto Balanced Optimum
9.3.6 Non-Convex Pareto-Front
9.4 Employing Genetic Algorithms
9.5 Application
9.5.1 Nonlinear Gauss-Helmert Model
9.6 Exercise
References
Part III Approximation of Functions and Data
10 Approximation with Radial Basis Functions
10.1 Basic Idea of RBF Interpolation
10.2 Positive Definite RBF Function
10.3 Compactly Supported Functions
10.4 Some Positive Definite RBF Function
10.4.1 Laguerre–Gauss Function
10.4.2 Generalized Multi-quadratic RBF
10.4.3 Wendland Function
10.4.4 Buhmann-Type RBF
10.5 Generic Derivatives of RBF Functions
10.6 Least Squares Approximation with RBF
10.7 Applications
10.7.1 Image Compression
10.7.2 RBF Collocation Solution of Partial Differential Equation
10.8 Exercise
10.8.1 Nonlinear Heat Transfer
References
11 Support Vector Machines (SVM)
11.1 Concept of Machine Learning
11.2 Optimal Hyperplane Classifier
11.2.1 Linear Separability
11.2.2 Computation of the Optimal Parameters
11.2.3 Dual Optimization Problem
11.3 Nonlinear Separability
11.4 Feature Spaces and Kernels
11.5 Application of the Algorithm
11.5.1 Computation Step by Step
11.5.2 Implementation of the Algorithm
11.6 Two Nonlinear Test Problems
11.6.1 Learning a Chess Board
11.6.2 Two Intertwined Spirals
11.7 Concept of SVM Regression
11.7.1 ε-Insensitive Loss Function
11.7.2 Concept of the Support Vector Machine Regression (SVMR)
11.7.3 The Algorithm of the SVMR
11.8 Employing Different Kernels
11.8.1 Gaussian Kernel
11.8.2 Polynomial Kernel
11.8.3 Wavelet Kernel
11.8.4 Universal Fourier Kernel
11.9 Applications
11.9.1 Image Classification
11.9.2 Maximum Flooding Level
11.10 Exercise
11.10.1 Noise Filtration
References
12 Symbolic Regression
12.1 Concept of Symbolic Regression
12.2 Problem of Kepler
12.2.1 Polynomial Regression
12.2.2 Neural Network
12.2.3 Support Vector Machine Regression
12.2.4 RBF Interpolation
12.2.5 Random Models
12.2.6 Symbolic Regression
12.3 Applications
12.3.1 Correcting Gravimetric Geoid Using GPS Ellipsoidal Heights
12.3.1.1 Dataset for the Numerical Computations
12.3.1.2 Parametric Models
12.3.1.3 Artificial Neural Network (ANN) Models
12.3.1.4 Application of Symbolic Regression
12.3.2 Geometric Transformation
12.3.2.1 Similarity Transformation
12.3.2.2 Affine Transformation
12.3.2.3 Projective Transformation
12.4 Exercise
12.4.1 Bremerton Data
12.4.1.1 Random Models
12.4.1.2 Keijzer Models
12.4.1.3 Symbolic Regression
References
13 Quantile Regression
13.1 Problems with the Ordinary Least Squares
13.1.1 Correlation Height and Age
13.1.2 Engel’s Problem
13.2 Concept of Quantile
13.2.1 Quantile as a Generalization of Median
13.2.2 Quantile for Probability Distributions
13.3 Linear Quantile Regression
13.3.1 Ordinary Least Square (OLS)
13.3.2 Median Regression (MR)
13.3.3 Quantile Regression (QR)
13.4 Computing Quantile Regression
13.4.1 Quantile Regression via Linear Programming
13.4.2 Boscovich’s Problem
13.4.3 Extension to Linear Combination of Nonlinear Functions
13.4.4 B-Spline Application
13.5 Applications
13.5.1 Separate Outliers in Cloud Points
13.5.2 Modelling Time-Series
13.6 Exercise
13.6.1 Regression of Implicit-Functions
References
14 Robust Regression
14.1 Basic Methods in Robust Regression
14.1.1 Concept of Robust Regression
14.1.2 Maximum Likelihood Method
14.1.2.1 Symbolic Solution
14.1.2.2 Solution Via Numerical Gröbner Basis
14.1.3 Danish Algorithm
14.1.3.1 Basic Idea
14.1.3.2 Danish Algorithm with Gröbner Basis
14.1.4 Danish Algorithm with PCA
14.1.5 RANSAC Algorithm
14.1.5.1 Basic Idea of RANSAC
14.1.5.2 The Necessary Number of Iterations
14.1.5.3 Threshold
14.1.5.4 Computation of the Method Step by Step
14.1.5.5 RANSAC with Gröbner Basis
14.1.5.6 Application of the RANSAC Method
14.1.5.7 Application of Self-Organizing Map (SOM) to the RANSAC Algorithm
14.2 Application Examples
14.2.1 Fitting a Sphere to Point Cloud Data
14.2.1.1 Algebraic Least Square When R is Unknown
14.2.1.2 Algebraic Least Square When R is Known
14.2.1.3 Geometric Fitting When R is Unknown
14.2.1.4 Geometric Fitting When R is Known
14.2.1.5 Directional Fitting
14.2.1.6 Application of the Self-Organizing Map to Reduce Data
14.2.1.7 Symbolic Computation of Gröbner Basis for Geometric Fitting
14.2.1.8 Geometric Fitting Via Gauss-Jacobi Method with SOM Data
14.2.1.9 Application of RANdom SAmple Consensus (RANSAC)
14.2.2 Fitting a Cylinder
14.2.2.1 Vector Algebraic Definition
14.2.2.2 Parametrized Form of the Cylinder Equation
14.2.2.3 Implicit Equation from the Parametric One
14.2.2.4 Computing Model Parameters
14.2.2.5 Computing Model Parameters in Overdetermined Case
14.2.2.6 Application to Estimation of Tree Stem Diameter
14.2.2.7 Expectation Maximization
14.2.2.8 Maximum Likelihood for Gaussian Mixture
14.2.2.9 Application to Leafy Tree
14.3 Problem
14.3.1 Fitting a Plane to a Slope
14.3.1.1 Test Area, Equipment and Measured Data
14.3.1.2 Application of SVD
14.3.1.3 Solution Via PCA
14.3.1.4 Gröbner Basis Solution
14.3.1.5 Model Error Analysis
14.3.1.6 Application of the Danish Algorithm with Embedded Gröbner Basis Solution
14.3.1.7 Model Error Analysis
14.3.1.8 Application of the Danish Method with Embedded Weighted PCA (PCAW)
14.3.1.9 Model Error Analysis
References
15 Stochastic Modeling
15.1 Basic Stochastic Processes
15.1.1 Concept of Stochastic Processes
15.1.2 Examples for Stochastic Processes
15.1.3 Features of Stochastic Processes
15.2 Time Series
15.2.1 Concept of Time Series
15.2.2 Models of Time Series
15.3 Stochastic Differential Equations (SDE)
15.3.1 Ito Process
15.3.2 Ito Numerical Integral
15.3.3 Euler–Maruyama Method
15.4 Numerical Solution of SDE
15.4.1 Single Realization
15.4.2 Many Realizations
15.4.3 Slice Distribution
15.4.4 Standard Error Band
15.5 Parameter Estimation
15.5.1 Measurement Values
15.5.2 Likelihood Function
15.5.3 Maximization of the Likelihood Function
15.5.4 Simulation with the Estimated Parameters
15.5.5 Deterministic Versus Stochastic Modeling
15.6 Application of Machine Learning
15.6.1 Applying Machine Learning
15.6.1.1 Input–Output Data Pairs
15.6.1.2 Training Process
15.6.1.3 Using ML Method for Simulation
15.6.2 Machine Learning Differential Equation Model
15.6.2.1 Employing Stochastic Differential Equation Form
15.6.2.2 Other Hybrid Solution Methods
15.6.3 Comparing the Different Stochastic Modeling Methods
15.6.4 Image Classification
15.6.4.1 Vacant Land and Residential Areas
15.6.4.2 Classification
15.6.4.3 Quality of the Classification
15.6.4.4 Optimization
15.6.4.5 Simulated Annealing
15.6.4.6 Differential Evolution
15.6.4.7 Random Search
15.6.4.8 Black Hole
15.6.5 Regression
15.6.5.1 The Function Approximation in the Form of the Neural Network Differential Equation
15.6.5.2 Employing Stochastic Differential Equation Form
16 Parallel Computation
16.1 Introduction
16.2 Amdahl’s Law
16.3 Implicit and Explicit Parallelism
16.4 Dispatching Tasks
16.5 Balancing Loads
16.6 Parallel Computing with GPU
16.6.1 Neural Network Computing with GPU
16.6.1.1 Function Approximation
16.6.1.2 Classification Problem via Deep Learning
16.6.2 Image Processing with GPU
16.7 Applications
16.7.1 3D Ranging Using the Dixon Resultant
16.7.2 Reducing Colors via Color Approximation
16.8 Problem
16.8.1 Photogrammetric Positioning by Gauss–Jacobi Method
Reference