Deep learning is an important element of artificial intelligence, especially in applications such as image classification, where various neural network architectures, e.g., convolutional neural networks, have yielded reliable results. This book introduces deep learning for time series analysis, particularly for cyclic time series, and elaborates on the deep-level architecture of the analysis methods it employs. Cyclic time series usually exhibit special traits that can be exploited for better classification performance; the book addresses these traits as well as the processing of cyclic time series.
An important factor in classifying stochastic time series is the structural risk associated with the architecture of a classification method. The book formulates both the structural risk and the learning capacity defined for a classification method. These formulations and their mathematical derivations help researchers understand the methods and express their own methodologies in an objective mathematical way. The book is designed as a self-study textbook for readers with different backgrounds and levels of familiarity with machine learning, including students, engineers, researchers, and scientists in this domain. Its numerous informative illustrations lead readers to a deep understanding of deep learning methods for time series analysis.
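As background for the blurb's claim (this is the classical Vapnik-Chervonenkis formulation, not necessarily the notation or derivation the book itself develops), structural risk minimization bounds the true risk $R(f)$ of a classifier $f$, with probability at least $1-\eta$, by the empirical risk plus a capacity term:

$$ R(f) \;\le\; R_{\mathrm{emp}}(f) \;+\; \sqrt{\frac{h\left(\ln\frac{2N}{h} + 1\right) - \ln\frac{\eta}{4}}{N}} $$

Here $h$ is the VC dimension of the hypothesis class and $N$ is the number of training samples. The bound captures the trade-off the blurb alludes to: a richer architecture can lower $R_{\mathrm{emp}}$ but raises the capacity term, which is why structural risk and learning capacity are naturally formulated together.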
Author: Arash Gharehbaghi
Publisher: CRC Press/Science Publishers
Year: 2023
Language: English
Pages: 207
City: Boca Raton
Cover
Title Page
Copyright Page
Dedication
Foreword
Preface
Table of Contents
Contributors
Part I Fundamentals of Learning
1. Introduction to Learning
1.1 Artificial Intelligence
1.2 Data and Signal Definition
1.3 Data Versus Signal
1.4 Signal Models
1.5 Noise and Interference
1.6 Time Series Definition
1.7 Time Series Analysis
1.8 Deep Learning and Time Series Analysis
1.9 Organisation of the Book
2. Learning Theory
2.1 Learning and Adaptation
2.2 Learning in a Practical Example
2.3 Mathematical View of Learning
2.3.1 Training and Validation Data
2.3.2 Training Method
2.3.3 Training Parameters
2.3.4 Hyperparameters
2.4 Learning Phases
2.5 Training, Validation, and Test
2.6 Learning Schemes
2.6.1 Supervised-Static Learning
2.6.2 Supervised-Dynamic Learning
2.6.3 Unsupervised-Static Learning
2.6.4 Unsupervised-Dynamic Learning
2.7 Training Criteria
2.8 Optimization, Training, and Learning
2.9 Evaluation of Learning Performance
2.9.1 Structural Risk
2.9.2 Empirical Risk
2.9.3 Overfitting and Underfitting Risk
2.9.4 Learning Capacity
2.10 Validation
2.10.1 Repeated Random Sub-Sampling (RRSS)
2.10.2 K-Fold Validation
2.10.3 A-Test Validation
2.11 Privileges of A-Test Method
2.11.1 A-Test and Structural Risk
2.11.2 A-Test and Learning Capacity
2.11.3 A-Test vs. Other Methods
2.12 Large and Small Training Data
3. Pre-processing and Visualisation
3.1 Dimension Reduction
3.1.1 Feature Selection
3.1.1.1 Hill-Climbing Algorithm
3.1.1.2 Linear Discriminant Analysis (LDA)
3.1.1.3 Fisher Method
3.1.2 Linear Transformation
3.1.2.1 Principal Component Analysis (PCA)
3.1.2.2 PCA-Fisher Method
3.2 Supervised Mapping
3.2.1 K-Nearest Neighbours (KNN)
3.2.2 Perceptron Neural Network
3.2.3 Multi-layer Perceptron Neural Networks (MLP)
3.3 Unsupervised Mapping
3.3.1 K-Means Clustering
3.3.2 Self-Organizing Map (SOM)
3.3.3 Hierarchical Clustering
Part II Essentials of Time Series Analysis
4. Basics of Time Series
4.1 Introduction to Time Series Analysis
4.2 Deterministic, Chaotic and Stochastic
4.3 Stochastic Behaviors of Time Series
4.3.1 Cyclic Time Series
4.3.1.1 Sector Definition
4.3.1.2 Uniform Sectors
4.3.1.3 Growing-Time Sectors
4.3.2 Partially Cyclic Time Series
4.4 Time Series Prediction
4.5 Time Series Classification
5. Multi-Layer Perceptron (MLP) Neural Networks for Time Series Classification
5.1 Time-Delayed Neural Network (TDNN)
5.2 Time-Growing Neural Network (TGNN)
5.3 Forward, Backward and Bilateral Time-Growing Window
5.4 Privileges of Time-Growing Neural Network
5.4.1 TGNN includes MLP in its architecture
5.4.2 TGNN can include TDNN in its structure
5.4.3 TGNN is optimal in learning the first window
6. Dynamic Models for Sequential Data Analysis
6.1 Dynamic Time Warping (Structural Classification)
6.2 Hidden Markov Model (Statistical Classification)
6.2.1 Model-based analysis
6.2.2 Essentials of Hidden Markov Model (HMM)
6.2.3 Problem statement and implementation
6.2.4 Time series analysis and HMM
6.3 Recurrent Neural Network
Part III Deep Learning Approaches to Time Series Classification
7. Clustering for Learning at Deep Level
7.1 Clustering as a Tool for Deep Learning
7.2 Modified K-Means Method
7.3 Modified Fuzzy C-Means
7.4 Discriminant Analysis
7.5 Cluster-Based vs. Discriminant Analysis Methods
7.6 Combined Methods
8. Deep Time Growing Neural Network
8.1 Basic Architecture
8.2 Learning at the Deep Level
8.2.1 Learning the growing centre
8.2.2 Learning the deep elements
8.3 Surface Learning
9. Deep Learning of Cyclic Time Series
9.1 Time Growing Neural Network
9.2 Growing-Time Support Vector Machine
9.3 Distance-Based Learning
9.4 Optimization
10. Hybrid Method for Cyclic Time Series
10.1 Learning Deep Contents
10.2 Cyclic Learning
10.3 Classification
11. Recurrent Neural Networks (RNN)
11.1 Introduction
11.2 Structure of Recurrent Neural Networks
11.3 Unfolding the Network in Time
11.4 Backpropagation Through Time
11.5 The Challenge of Long-term Dependencies
11.6 Long Short-Term Memory (LSTM)
11.7 Other Recurrent Networks
11.7.1 Unfolding outputs at all steps
11.7.2 Gated recurrent networks
11.7.3 Echo state networks
12. Convolutional Neural Networks (CNN)
12.1 Introduction
12.2 Architecture Overview
12.3 Convolutional Layer
12.4 Pooling Layer
12.5 Learning of CNN
12.6 Recurrent CNN
Bibliography
Index