An in-depth resource covering machine and deep learning methods using MATLAB tools and algorithms, providing insight into algorithmic decision-making processes
Machine and Deep Learning Using MATLAB introduces early-career professionals to the power of MATLAB for exploring machine and deep learning applications by explaining the relevant MATLAB tool or app and how it is used for a given method or collection of methods. Each tool's properties, in terms of input and output arguments, are explained; its limitations or applicability is indicated in accompanying text or a table; and a complete running example is shown with all of the MATLAB command-prompt code needed.
The text also presents the results, in the form of figures or tables, alongside the given MATLAB code, and the written MATLAB code can later be used as a template for solving new cases or datasets. Each chapter features worked examples for self-study, with an accompanying website providing solutions and coding samples. Highlighted notes draw the reader's attention to critical points or issues.
Readers will also find information on
Numeric data acquisition and analysis, in the form of applying computational algorithms to uncover patterns in numeric data (clustering, or unsupervised learning)
Relationships between predictors and a response variable (supervised learning), subdivided into classification (discrete response) and regression (continuous response); a minimal MATLAB sketch of the clustering and classification workflows follows this list
Image acquisition and analysis, in the form of applying one of the neural networks and estimating network accuracy, network loss, and/or RMSE over the successive training, validation, and testing steps
Network retraining and creation for image labeling, object detection, regression classification, and text recognition
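As a quick, hedged illustration of the clustering and classification workflows listed above, the following minimal MATLAB sketch clusters Fisher's iris measurements with k-means and then fits and evaluates a k-nearest-neighbor classifier. The dataset, function choices, and parameter values are illustrative assumptions, not examples taken from the book.

% Minimal sketch (illustrative, not from the book): an unsupervised step followed by a supervised step
load fisheriris                      % built-in sample data: meas (150x4 numeric), species (150x1 labels)

% Unsupervised step: group the observations into three clusters with k-means
rng(1)                               % reproducible centroid initialization
[idx, C] = kmeans(meas, 3, 'Distance', 'sqeuclidean', 'Replicates', 5);

% Supervised step: hold out 30% of the rows, then fit a k-nearest-neighbor classifier
cv     = cvpartition(species, 'HoldOut', 0.3);
Xtrain = meas(training(cv), :);   Ytrain = species(training(cv));
Xtest  = meas(test(cv), :);       Ytest  = species(test(cv));
mdlKNN = fitcknn(Xtrain, Ytrain, 'NumNeighbors', 5);

% Evaluate the model: predict the held-out responses and report simple accuracy
yPred    = predict(mdlKNN, Xtest);
accuracy = mean(strcmp(yPred, Ytest))

Chapters 1 and 2 develop these unsupervised and supervised steps in much greater detail, including evaluation of cluster quality and of classification model accuracy.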
Machine and Deep Learning Using MATLAB is a comprehensive resource for professionals, advanced students, and researchers in engineering and scientific fields who have some familiarity with MATLAB and wish to gain mastery of the software and its many machine and deep learning applications.
Author(s): Kamal I. M. Al-Malah
Publisher: Wiley
Year: 2023
Language: English
Pages: 592
Cover
Title Page
Copyright Page
Table of Contents
Preface
About the Companion Website
1 Unsupervised Machine Learning (ML) Techniques
Introduction
Selection of the Right Algorithm in ML
Classical Multidimensional Scaling of Predictors Data
Principal Component Analysis (PCA)
k-Means Clustering
Distance Metrics: Locations of Cluster Centroids
Replications
Gaussian Mixture Model (GMM) Clustering
Optimum Number of GMM Clusters
Observations and Clusters Visualization
Evaluating Cluster Quality
Silhouette Plots
Hierarchical Clustering
Step 1 – Determine Hierarchical Structure
Step 2 – Divide Hierarchical Tree into Clusters
PCA and Clustering: Wine Quality
Feature Selection Using Laplacian (fsulaplacian) for Unsupervised Learning
CHW 1.1 The Iris Flower Features Data
CHW 1.2 The Ionosphere Data Features
CHW 1.3 The Small Car Data
CHW 1.4 Seeds Features Data
2 ML Supervised Learning: Classification Models
Fitting Data Using Different Classification Models
Customizing a Model
Creating Training and Test Datasets
Predicting the Response
Evaluating the Classification Model
KNN Model for All Categorical or All Numeric Data Type
KNN Model: Heart Disease Numeric Data
Viewing the Fitting Model Properties
The Fitting Model: Number of Neighbors and Weighting Factor
The Cost Penalty of the Fitting Model
KNN Model: Red Wine Data
Using MATLAB Classification Learner
Binary Decision Tree Model for Multiclass Classification of All Data Types
Classification Tree Model: Heart Disease Numeric Data Types
Classification Tree Model: Heart Disease All Predictor Data Types
Naïve Bayes Classification Model for All Data Types
Fitting Heart Disease Numeric Data to Naïve Bayes Model
Fitting Heart Disease All Data Types to Naïve Bayes Model
Discriminant Analysis (DA) Classifier for Numeric Predictors Only
Discriminant Analysis (DA): Heart Disease Numeric Predictors
Support Vector Machine (SVM) Classification Model for All Data Types
Properties of SVM Model
SVM Classification Model: Heart Disease Numeric Data Types
SVM Classification Model: Heart Disease All Data Types
Multiclass Support Vector Machine (fitcecoc) Model
Multiclass Support Vector Machines Model: Red Wine Data
Binary Linear Classifier (fitclinear) to High-Dimensional Data
CHW 2.1 Mushroom Edibility Data
CHW 2.2 1994 Adult Census Income Data
CHW 2.3 White Wine Classification
CHW 2.4 Cardiac Arrhythmia Data
CHW 2.5 Breast Cancer Diagnosis
3 Methods of Improving ML Predictive Models
Accuracy and Robustness of Predictive Models
Evaluating a Model: Cross-Validation
Cross-Validation Tune-up Parameters
Partition with K-Fold: Heart Disease Data Classification
Reducing Predictors: Feature Transformation and Selection
Factor Analysis
Feature Transformation and Factor Analysis: Heart Disease Data
Feature Selection
Feature Selection Using predictorImportance Function: Heart Disease Data
Sequential Feature Selection (SFS): sequentialfs Function with Model Error Handler
Accommodating Categorical Data: Creating Dummy Variables
Feature Selection with Categorical Heart Disease Data
Ensemble Learning
Creating Ensembles: Heart Disease Data
Ensemble Learning: Wine Quality Classification
Improving fitcensemble Predictive Model: Abalone Age Prediction
Improving fitctree Predictive Model with Feature Selection (FS): Credit Ratings Data
Improving fitctree Predictive Model with Feature Transformation (FT): Credit Ratings Data
Using MATLAB Regression Learner
Feature Selection and Feature Transformation Using Regression Learner App
Feature Selection Using Neighborhood Component Analysis (NCA) for Regression: Big Car Data
CHW 3.1 The Ionosphere Data
CHW 3.2 Sonar Dataset
CHW 3.3 White Wine Classification
CHW 3.4 Small Car Data (Regression Case)
4 Methods of ML Linear Regression
Introduction
Linear Regression Models
Fitting Linear Regression Models Using fitlm Function
How to Organize the Data?
Results Visualization: Big Car Data
Fitting Linear Regression Models Using fitglm Function
Nonparametric Regression Models
fitrtree Nonparametric Regression Model: Big Car Data
Support Vector Machine, fitrsvm, Nonparametric Regression Model: Big Car Data
Nonparametric Regression Model: Gaussian Process Regression (GPR)
Regularized Parametric Linear Regression
Ridge Linear Regression: The Penalty Term
Fitting Ridge Regression Models
Predicting Response Using Ridge Regression Models
Determining the Ridge Regression Parameter, λ
The Ridge Regression Model: Big Car Data
The Ridge Regression Model with Optimum λ: Big Car Data
Regularized Parametric Linear Regression Model: Lasso
Stepwise Parametric Linear Regression
Fitting Stepwise Linear Regression
How to Specify stepwiselm Model?
Stepwise Linear Regression Model: Big Car Data
CHW 4.1 Boston House Price
CHW 4.2 The Forest Fires Data
CHW 4.3 The Parkinson’s Disease Telemonitoring Data
CHW 4.4 The Car Fuel Economy Data
5 Neural Networks
Introduction
Feed-Forward Neural Networks
Feed-Forward Neural Network Classification
Feed-Forward Neural Network Regression
Numeric Data: Dummy Variables
Neural Network Pattern Recognition (nprtool) Application
Command-Based Feed-Forward Neural Network Classification: Heart Data
Neural Network Regression (nftool)
Command-Based Feed-Forward Neural Network Regression: Big Car Data
Training the Neural Network Regression Model Using fitrnet Function: Big Car Data
Finding the Optimum Regularization Strength for Neural Network Using Cross-Validation: Big Car Data
Custom Hyperparameter Optimization in Neural Network Regression: Big Car Data
CHW 5.1 Mushroom Edibility Data
CHW 5.2 1994 Adult Census Income Data
CHW 5.3 Breast Cancer Diagnosis
CHW 5.4 Small Car Data (Regression Case)
CHW 5.5 Boston House Price
6 Pretrained Neural Networks: Transfer Learning
Deep Learning: Image Networks
Data Stores in MATLAB
Image and Augmented Image Datastores
Accessing an Image File
Retraining: Transfer Learning for Image Recognition
Convolutional Neural Network (CNN) Layers: Channels and Activations
Convolution 2-D Layer Features via Activations
Extraction and Visualization of Activations
A 2-D (or 2-D Grouped) Convolutional Layer
Features Extraction for Machine Learning
Image Features in Pretrained Convolutional Neural Networks (CNNs)
Classification with Machine Learning
Feature Extraction for Machine Learning: Flowers
Pattern Recognition Network Generation
Machine Learning Feature Extraction: Spectrograms
Network Object Prediction Explainers
Occlusion Sensitivity
imageLIME Features Explainer
gradCAM Features Explainer
HCW 6.1 CNN Retraining for Round Worms Alive or Dead Prediction
HCW 6.2 CNN Retraining for Food Images Prediction
HCW 6.3 CNN Retraining for Merchandise Data Prediction
HCW 6.4 CNN Retraining for Musical Instrument Spectrograms Prediction
HCW 6.5 CNN Retraining for Fruit/Vegetable Varieties Prediction
7 A Convolutional Neural Network (CNN) Architecture and Training
A Simple CNN Architecture: The Land Satellite Images
Displaying Satellite Images
Training Options
Mini Batches
Learning Rates
Gradient Clipping
Algorithms
Training a CNN for Landcover Dataset
Layers and Filters
Filters in Convolution Layers
Viewing Filters: AlexNet Filters
Validation Data
Using shuffle Function
Improving Network Performance
Training Algorithm Options
Training Data
Architecture
Image Augmentation: The Flowers Dataset
Directed Acyclic Graphs Networks
Deep Network Designer (DND)
Semantic Segmentation
Analyze Training Data for Semantic Segmentation
Create a Semantic Segmentation Network
Train and Test the Semantic Segmentation Network
HCW 7.1 CNN Creation for Round Worms Alive or Dead Prediction
HCW 7.2 CNN Creation for Food Images Prediction
HCW 7.3 CNN Creation for Merchandise Data Prediction
HCW 7.4 CNN Creation for Musical Instrument Spectrograms Prediction
HCW 7.5 CNN Creation for Chest X-ray Prediction
HCW 7.6 Semantic Segmentation Network for CamVid Dataset
8 Regression Classification: Object Detection
Preparing Data for Regression
Modification of CNN Architecture from Classification to Regression
Root-Mean-Square Error
AlexNet-Like CNN for Regression: Hand-Written Synthetic Digit Images
A New CNN for Regression: Hand-Written Synthetic Digit Images
Deep Network Designer (DND) for Regression
Loading Image Data
Generating Training Data
Creating a Network Architecture
Importing Data
Training the Network
Test Network
YOLO Object Detectors
Object Detection Using YOLO v4
COCO-Based Creation of a Pretrained YOLO v4 Object Detector
Fine-Tuning of a Pretrained YOLO v4 Object Detector
Evaluating an Object Detector
Object Detection Using R-CNN Algorithms
R-CNN
Fast R-CNN
Faster R-CNN
Transfer Learning (Re-Training)
R-CNN Creation and Training
Fast R-CNN Creation and Training
Faster R-CNN Creation and Training
evaluateDetectionPrecision Function for Precision Metric
evaluateDetectionMissRate for Miss Rate Metric
HCW 8.1 Testing yolov4ObjectDetector and fasterRCNN Object Detector
HCW 8.2 Creation of Two CNN-based yolov4ObjectDetectors
HCW 8.3 Creation of GoogleNet-Based Fast R-CNN Object Detector
HCW 8.4 Creation of a GoogleNet-Based Faster R-CNN Object Detector
HCW 8.5 Calculation of Average Precision and Miss Rate Using GoogleNet-Based Faster R-CNN Object Detector
HCW 8.6 Calculation of Average Precision and Miss Rate Using GoogleNet-Based yolov4 Object Detector
HCW 8.7 Faster R-CNN-Based Car Objects Prediction and Calculation of Average Precision for Training and Test Data
9 Recurrent Neural Network (RNN)
Long Short-Term Memory (LSTM) and BiLSTM Network
Train LSTM RNN Network for Sequence Classification
Improving LSTM RNN Performance
Sequence Length
Classifying Categorical Sequences
Sequence-to-Sequence Regression Using Deep Learning: Turbo Fan Data
Classify Text Data Using Deep Learning: Factory Equipment Failure Text Analysis – 1
Classify Text Data Using Deep Learning: Factory Equipment Failure Text Analysis – 2
Word-by-Word Text Generation Using Deep Learning – 1
Word-by-Word Text Generation Using Deep Learning – 2
Train Network for Time Series Forecasting Using Deep Network Designer (DND)
Train Network with Numeric Features
HCW 9.1 Text Classification: Factory Equipment Failure Text Analysis
HCW 9.2 Text Classification: Sentiment Labeled Sentences Data Set
HCW 9.3 Text Classification: Netflix Titles Data Set
HCW 9.4 Text Regression: Video Game Titles Data Set
HCW 9.5 Multivariate Classification: Mill Data Set
HCW 9.6 Word-by-Word Text Generation Using Deep Learning
10 Image/Video-Based Apps
Image Labeler (IL) App
Creating ROI Labels
Creating Scene Labels
Label Ground Truth
Export Labeled Ground Truth
Video Labeler (VL) App: Ground Truth Data Creation, Training, and Prediction
Ground Truth Labeler (GTL) App
Running/Walking Classification with Video Clips Using LSTM
Experiment Manager (EM) App
Image Batch Processor (IBP) App
HCW 10.1 Cat Dog Video Labeling, Training, and Prediction – 1
HCW 10.2 Cat Dog Video Labeling, Training, and Prediction – 2
HCW 10.3 EM Hyperparameters of CNN Retraining for Merchandise Data Prediction
HCW 10.4 EM Hyperparameters of CNN Retraining for Round Worms Alive or Dead Prediction
HCW 10.5 EM Hyperparameters of CNN Retraining for Food Images Prediction
Appendix A Useful MATLAB Functions
A.1 Data Transfer from an External Source into MATLAB
A.2 Data Import Wizard
A.3 Table Operations
A.4 Table Statistical Analysis
A.5 Access to Table Variables (Column Titles)
A.6 Merging Tables with Mixed Columns and Rows
A.7 Data Plotting
A.8 Data Normalization
A.9 How to Scale Numeric Data Columns to Vary Between 0 and 1
A.10 Random Split of a Matrix into a Training and Test Set
A.11 Removal of NaN Values from a Matrix
A.12 How to Calculate the Percent of Truly Judged Class Type Cases for a Binary Class Response
A.13 Error Function m-file
A.14 Conversion of Categorical into Numeric Dummy Matrix
A.15 evaluateFit2 Function
A.16 showActivationsForChannel Function
A.17 upsampLowRes Function
A.18A preprocessData Function
A.18B preprocessData2 Function
A.19 processTurboFanDataTrain Function
A.20 processTurboFanDataTest Function
A.21 preprocessText Function
A.22 documentGenerationDatastore Function
A.23 subset Function for an Image Data Store Partition
Index
End User License Agreement