Learning with Fractional Orthogonal Kernel Classifiers in Support Vector Machines: Theory, Algorithms and Applications

This book contains chapters on support vector algorithms from several perspectives, including their mathematical background, the properties of various kernel functions, and a range of applications. The main focus is on orthogonal kernel functions: the properties of the classical kernels (Chebyshev, Legendre, Gegenbauer, and Jacobi) are reviewed in dedicated chapters, and the fractional form of each of these kernels is introduced there as well. To make these kernel functions easy to apply, a tutorial on the Python package ORSVM is also provided. The book further presents a variety of applications of support vector algorithms: beyond classification, these algorithms, together with the introduced kernel functions, are used to solve ordinary, partial, and fractional differential equations as well as integral and integro-differential equations.
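
As a concrete (if simplified) picture of what an orthogonal kernel classifier looks like, the sketch below builds one common form of the ordinary Chebyshev kernel from the literature and plugs it into a standard SVM as a custom kernel. This is not the book's ORSVM implementation; it assumes NumPy and scikit-learn, features scaled into [-1, 1], and a small epsilon in the weight term to keep it finite.

```python
# Minimal sketch (not the book's ORSVM code): an ordinary Chebyshev kernel of the
# form K(x, z) = prod_j [ sum_{i=0}^{n} T_i(x_j) T_i(z_j) / sqrt(1 - x_j z_j + eps) ],
# one common variant in the kernel-SVM literature, used as a callable kernel in
# scikit-learn's SVC. Features are assumed to be scaled into [-1, 1].
import numpy as np
from numpy.polynomial.chebyshev import chebvander
from sklearn.svm import SVC

def chebyshev_kernel(X, Z, degree=4, eps=1e-3):
    """Gram matrix of the ordinary Chebyshev kernel between rows of X and Z."""
    K = np.ones((X.shape[0], Z.shape[0]))
    for j in range(X.shape[1]):                          # one factor per feature
        Tx = chebvander(X[:, j], degree)                 # T_0..T_degree at X[:, j]
        Tz = chebvander(Z[:, j], degree)                 # T_0..T_degree at Z[:, j]
        num = Tx @ Tz.T                                  # sum_i T_i(x_j) T_i(z_j)
        den = np.sqrt(1.0 - np.outer(X[:, j], Z[:, j]) + eps)
        K *= num / den
    return K

# Usage on toy data already lying in [-1, 1]^2 with an XOR-like label.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(int)
clf = SVC(kernel=lambda A, B: chebyshev_kernel(A, B, degree=4)).fit(X, y)
print("training accuracy:", clf.score(X, y))
```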

Real-time and big-data applications of support vector algorithms are also growing rapidly. The book therefore presents a Compute Unified Device Architecture (CUDA) approach to parallelizing support vector algorithms based on orthogonal kernel functions. Overall, it shows how to use these algorithms in different settings and gives machine learning and scientific machine learning researchers a practical perspective on applying fractional orthogonal kernel functions to pattern recognition and scientific computing problems.
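
To illustrate the kind of GPU acceleration Chapter 11 discusses, the hedged sketch below (not the book's code; it assumes the numba package and a CUDA-capable GPU) assigns each entry of a Chebyshev-type Gram matrix to its own CUDA thread and evaluates the polynomials with the three-term recurrence T_k(x) = 2x T_{k-1}(x) - T_{k-2}(x), so the quadratic-cost kernel evaluation is spread across the GPU grid.

```python
# Minimal sketch (assumes numba and a CUDA GPU; not the book's implementation):
# one CUDA thread computes one Gram-matrix entry of a Chebyshev-type kernel.
import math
import numpy as np
from numba import cuda

@cuda.jit
def cheb_gram(X, Z, K, degree):
    i, j = cuda.grid(2)                        # absolute (row, column) of this thread
    if i < X.shape[0] and j < Z.shape[0]:
        val = 1.0
        for d in range(X.shape[1]):
            x, z = X[i, d], Z[j, d]
            # sum_{k=0}^{degree} T_k(x) T_k(z) via the three-term recurrence
            tx0, tx1 = 1.0, x
            tz0, tz1 = 1.0, z
            s = tx0 * tz0 + tx1 * tz1
            for k in range(2, degree + 1):
                tx0, tx1 = tx1, 2.0 * x * tx1 - tx0
                tz0, tz1 = tz1, 2.0 * z * tz1 - tz0
                s += tx1 * tz1
            val *= s / math.sqrt(1.0 - x * z + 1e-3)   # same weight as the CPU sketch
        K[i, j] = val

n, dim = 512, 8
X = np.random.uniform(-1.0, 1.0, size=(n, dim))
d_X = cuda.to_device(X)
d_K = cuda.device_array((n, n), dtype=np.float64)

threads = (16, 16)                              # 256 threads per block
blocks = ((n + 15) // 16, (n + 15) // 16)       # enough blocks to cover the matrix
cheb_gram[blocks, threads](d_X, d_X, d_K, 4)
K = d_K.copy_to_host()
print(K.shape, np.allclose(K, K.T))             # Gram matrix should be symmetric
```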

Author(s): Jamal Amani Rad (editor), Kourosh Parand (editor), Snehashish Chakraverty (editor)
Series: Industrial and Applied Mathematics
Edition: 1
Publisher: Springer
Year: 2023

Language: English
Pages: 319
City: Singapore
Tags: Support Vector Machine; Kernel; Fractional; Orthogonal Function; Pattern Recognition; Machine Learning

Preface
Contents
Editors and Contributors
Part I Basics of Support Vector Machines
1 Introduction to SVM
1.1 What Is Machine Learning?
1.1.1 Classification of Machine Learning Techniques
1.2 What Is the Pattern?
1.3 An Introduction to SVM with a Geometric Interpretation
1.4 History of SVMs
1.5 SVM Applications
References
2 Basics of SVM Method and Least Squares SVM
2.1 Linear SVM Classifiers
2.1.1 Hard Margin SVM
2.1.2 Soft Margin SVM
2.2 Nonlinear SVM Classifiers
2.2.1 Kernel Trick and Mercer Condition
2.3 SVM Regressors
2.4 LS-SVM Classifiers
2.5 LS-SVM Regressors
References
Part II Special Kernel Classifiers
3 Fractional Chebyshev Kernel Functions: Theory and Application
3.1 Introduction
3.2 Preliminaries
3.2.1 Properties of Chebyshev Polynomials
3.2.2 Properties of Fractional Chebyshev Functions
3.3 Chebyshev Kernel Functions
3.3.1 Ordinary Chebyshev Kernel Function
3.3.2 Other Chebyshev Kernel Functions
3.3.3 Fractional Chebyshev Kernel
3.4 Application of Chebyshev Kernel Functions on Real Datasets
3.4.1 Spiral Dataset
3.4.2 Three Monks' Dataset
3.5 Conclusion
References
4 Fractional Legendre Kernel Functions: Theory and Application
4.1 Introduction
4.2 Preliminaries
4.2.1 Properties of Legendre Polynomials
4.2.2 Properties of Fractional Legendre Functions
4.3 Legendre Kernel Functions
4.3.1 Ordinary Legendre Kernel Function
4.3.2 Other Legendre Kernel Functions
4.3.3 Fractional Legendre Kernel
4.4 Application of Legendre Kernel Functions on Real Datasets
4.4.1 Spiral Dataset
4.4.2 Three Monks' Dataset
4.5 Conclusion
References
5 Fractional Gegenbauer Kernel Functions: Theory and Application
5.1 Introduction
5.2 Preliminaries
5.2.1 Properties of Gegenbauer Polynomials
5.2.2 Properties of Fractional Gegenbauer Polynomials
5.3 Gegenbauer Kernel Functions
5.3.1 Ordinary Gegenbauer Kernel Function
5.3.2 Validation of Gegenbauer Kernel Function
5.3.3 Other Gegenbauer Kernel Functions
5.3.4 Fractional Gegenbauer Kernel Function
5.4 Application of Gegenbauer Kernel Functions on Real Datasets
5.4.1 Spiral Dataset
5.4.2 Three Monks' Dataset
5.5 Conclusion
References
6 Fractional Jacobi Kernel Functions: Theory and Application
6.1 Introduction
6.2 Preliminaries
6.2.1 Properties of Jacobi Polynomials
6.2.2 Properties of Fractional Jacobi Functions
6.3 Jacobi Kernel Functions
6.3.1 Ordinary Jacobi Kernel Function
6.3.2 Other Jacobi Kernel Functions
6.3.3 Fractional Jacobi Kernel
6.4 Application of Jacobi Kernel Functions on Real Datasets
6.4.1 Spiral Dataset
6.4.2 Three Monks' Dataset
6.5 Summary and Conclusion
References
Part III Applications of Orthogonal Kernels
7 Solving Ordinary Differential Equations by LS-SVM
7.1 Introduction
7.2 LS-SVM Formulation
7.2.1 Collocation Form of LS-SVM
7.3 Rational Legendre Kernels
7.4 Collocation Form of LS-SVM for Lane-Emden Type Equations
7.5 Numerical Examples
7.6 Conclusion
References
8 Solving Partial Differential Equations by LS-SVM
8.1 Introduction
8.2 LS-SVM Method for Solving Second-Order Partial Differential Equations
8.2.1 Temporal Discretization
8.2.2 LS-SVM Collocation Method
8.3 Numerical Simulations
8.3.1 Fokker–Planck Equation
8.3.2 Generalized Fitzhugh–Nagumo Equation
8.4 Conclusion
References
9 Solving Integral Equations by LS-SVR
9.1 Introduction
9.2 Integral Equations
9.2.1 Fredholm Integral Equations
9.2.2 Volterra Integral Equations
9.2.3 Volterra-Fredholm Integral Equations
9.2.4 Integro-Differential Equations
9.2.5 Multi-dimensional Integral Equations
9.2.6 System of Integral Equations
9.3 LS-SVR for Solving IEs
9.3.1 One-Dimensional Case
9.3.2 Multi-dimensional Case
9.3.3 System of Integral Equations
9.3.4 CLS-SVR Method
9.3.5 GLS-SVR Method
9.4 Numerical Simulations
9.5 Conclusion
References
10 Solving Distributed-Order Fractional Equations by LS-SVR
10.1 Introduction
10.1.1 A Brief Review of Other Methods Existing in the Literature
10.2 Preliminaries
10.2.1 Fractional Derivative
10.2.2 Numerical Integration
10.3 LS-SVR Method for Solving Distributed-Order Fractional Differential Equations
10.4 Numerical Results and Discussion
10.4.1 Test Problem 1
10.4.2 Test Problem 2
10.4.3 Test Problem 3
10.4.4 Test Problem 4
10.4.5 Test Problem 5
10.5 Conclusion
References
Part IV Orthogonal Kernels in Action
11 GPU Acceleration of LS-SVM, Based on Fractional Orthogonal Functions
11.1 Parallel Processing
11.2 GPU Architecture
11.2.1 CUDA Programming with Python
11.3 Analyzing Codes and Functions
11.3.1 Analyzing the Training Function
11.3.2 Analyzing the Test Function
11.4 Hardware and Software Requirements
11.5 Accelerating the Chebyshev Kernel
11.6 More Optimizations
11.7 Accelerating the QP Solver
11.8 Conclusion
References
12 Classification Using Orthogonal Kernel Functions: Tutorial on ORSVM Package
12.1 Introduction
12.1.1 ORSVM
12.2 How to Install
12.3 Model Class
12.4 SVM Class
12.4.1 Chebyshev Class
12.4.2 Legendre Class
12.4.3 Gegenbauer Class
12.4.4 Jacobi Class
12.5 Transformation Function
12.6 How to Use
Appendix: Python Programming Prerequisite
A.1 Introduction
A.2 Basics of Python
How to Use Python?
Python Basics
Basic Syntax
Comments
Variable Types
Numbers and Casting
Strings
Lists
Dictionary
If ... Else
While Loops
For Loops
Range, Break, and Continue
Try, Except
Functions
Libraries
A.3 Pandas
A.4 Numpy
A.5 Matplotlib
Pyplot
Formatting the Style of Your Plot
Plotting with Keyword Strings
Plotting with Categorical Variables