Practical Deep Learning: A Python-Based Introduction

Practical Deep Learning teaches total beginners how to build the datasets and models needed to train neural networks for their own DL projects. If you’ve been curious about machine learning but didn’t know where to start, this is the book you’ve been waiting for. Focusing on the subfield of machine learning known as deep learning, it explains core concepts and gives you the foundation you need to start building your own models. Rather than simply outlining recipes for using existing toolkits, Practical Deep Learning teaches you the why of deep learning and will inspire you to explore further.

All you need is basic familiarity with computer programming and high school math; the book will cover the rest. After an introduction to Python, you’ll move through key topics like how to build a good training dataset, work with the scikit-learn and Keras libraries, and evaluate your models’ performance.

You’ll also learn:

• How to use classic machine learning models like k-Nearest Neighbors, Random Forests, and Support Vector Machines
• How neural networks work and how they’re trained
• How to use convolutional neural networks
• How to develop a successful deep learning model from scratch

You’ll conduct experiments along the way, building to a final case study that incorporates everything you’ve learned. The perfect introduction to this dynamic, ever-expanding field, Practical Deep Learning will give you the skills and confidence to dive into your own machine learning projects.
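To give a flavor of the workflow the book builds toward (load a dataset, split it, fit a model, evaluate it), here is a minimal sketch of a classical-model experiment using scikit-learn’s bundled iris data and a k-Nearest Neighbors classifier, both topics covered in the book. The split size, neighbor count, and random seed are illustrative assumptions, not values taken from the book.

    # Minimal sketch: the load/split/fit/score pattern (illustrative values).
    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.neighbors import KNeighborsClassifier

    # Load the iris dataset: 150 flowers, 4 features each, 3 classes.
    X, y = load_iris(return_X_y=True)

    # Hold out 25% of the samples as a test set (assumed split, not the book's).
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=0)

    # Fit a 3-nearest-neighbors classifier and score it on the held-out data.
    clf = KNeighborsClassifier(n_neighbors=3)
    clf.fit(X_train, y_train)
    print("test accuracy:", clf.score(X_test, y_test))

The iris experiments in Chapter 7 follow this same pattern across several classical models before the later chapters move on to neural networks and Keras.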

Author(s): Ron Kneusel
Edition: 1
Publisher: No Starch Press
Year: 2021

Language: English
Commentary: Vector PDF
Pages: 464
City: San Francisco, CA
Tags: Machine Learning; Neural Networks; Deep Learning; Python; Convolutional Neural Networks; Keras; NumPy; Model Evaluation; Elementary; Datasets

Brief Contents
Contents in Detail
Foreword
Acknowledgments
Introduction
Who Is This Book For?
What Can You Expect to Learn?
About This Book
Chapter 1: Getting Started
The Operating Environment
NumPy
scikit-learn
Keras with TensorFlow
Installing the Toolkits
Basic Linear Algebra
Vectors
Matrices
Multiplying Vectors and Matrices
Statistics and Probability
Descriptive Statistics
Probability Distributions
Statistical Tests
Graphics Processing Units
Summary
Chapter 2: Using Python
The Python Interpreter
Statements and Whitespace
Variables and Basic Data Structures
Representing Numbers
Variables
Strings
Lists
Dictionaries
Control Structures
if-elif-else Statements
for Loops
while Loops
break and continue Statements
with Statement
Handling Errors with try-except Blocks
Functions
Modules
Summary
Chapter 3: Using NumPy
Why NumPy?
Arrays vs. Lists
Testing Array and List Speed
Basic Arrays
Defining an Array with np.array
Defining Arrays with 0s and 1s
Accessing Elements in an Array
Indexing into an Array
Slicing an Array
The Ellipsis
Operators and Broadcasting
Array Input and Output
Random Numbers
NumPy and Images
Summary
Chapter 4: Working with Data
Classes and Labels
Features and Feature Vectors
Types of Features
Feature Selection and the Curse of Dimensionality
Features of a Good Dataset
Interpolation and Extrapolation
The Parent Distribution
Prior Class Probabilities
Confusers
Dataset Size
Data Preparation
Scaling Features
Missing Features
Training, Validation, and Test Data
The Three Subsets
Partitioning the Dataset
k-Fold Cross Validation
Look at Your Data
Searching for Problems in the Data
Cautionary Tales
Summary
Chapter 5: Building Datasets
Irises
Breast Cancer
MNIST Digits
CIFAR-10
Data Augmentation
Why Should You Augment Training Data?
Ways to Augment Training Data
Augmenting the Iris Dataset
Augmenting the CIFAR-10 Dataset
Summary
Chapter 6: Classical Machine Learning
Nearest Centroid
k-Nearest Neighbors
Naïve Bayes
Decision Trees and Random Forests
Recursion Primer
Building Decision Trees
Random Forests
Support Vector Machines
Margins
Support Vectors
Optimization
Kernels
Summary
Chapter 7: Experiments with Classical Models
Experiments with the Iris Dataset
Testing the Classical Models
Implementing a Nearest Centroid Classifier
Experiments with the Breast Cancer Dataset
Two Initial Test Runs
The Effect of Random Splits
Adding k-fold Validation
Searching for Hyperparameters
Experiments with the MNIST Dataset
Testing the Classical Models
Analyzing Runtimes
Experimenting with PCA Components
Scrambling Our Dataset
Classical Model Summary
Nearest Centroid
k-Nearest Neighbors
Naïve Bayes
Decision Trees
Random Forests
Support Vector Machines
When to Use Classical Models
Handling Small Datasets
Dealing with Reduced Computational Requirements
Having Explainable Models
Working with Vector Inputs
Summary
Chapter 8: Introduction to Neural Networks
Anatomy of a Neural Network
The Neuron
Activation Functions
Architecture of a Network
Output Layers
Representing Weights and Biases
Implementing a Simple Neural Network
Building the Dataset
Implementing the Neural Network
Training and Testing the Neural Network
Summary
Chapter 9: Training a Neural Network
A High-Level Overview
Gradient Descent
Finding Minimums
Updating the Weights
Stochastic Gradient Descent
Batches and Minibatches
Convex vs. Nonconvex Functions
Ending Training
Updating the Learning Rate
Momentum
Backpropagation
Backprop, Take 1
Backprop, Take 2
Loss Functions
Absolute and Mean Squared Error Loss
Cross-Entropy Loss
Weight Initialization
Overfitting and Regularization
Understanding Overfitting
Understanding Regularization
L2 Regularization
Dropout
Summary
Chapter 10: Experiments with Neural Networks
Our Dataset
The MLPClassifier Class
Architecture and Activation Functions
The Code
The Results
Batch Size
Base Learning Rate
Training Set Size
L2 Regularization
Momentum
Weight Initialization
Feature Ordering
Summary
Chapter 11: Evaluating Models
Definitions and Assumptions
Why Accuracy Is Not Enough
The 2 × 2 Confusion Matrix
Metrics Derived from the 2 × 2 Confusion Matrix
Deriving Metrics from the 2 × 2 Table
Using Our Metrics to Interpret Models
More Advanced Metrics
Informedness and Markedness
F1 Score
Cohen's Kappa
Matthews Correlation Coefficient
Implementing Our Metrics
The Receiver Operating Characteristics Curve
Gathering Our Models
Plotting Our Metrics
Exploring the ROC Curve
Comparing Models with ROC Analysis
Generating an ROC Curve
The Precision–Recall Curve
Handling Multiple Classes
Extending the Confusion Matrix
Calculating Weighted Accuracy
Multiclass Matthews Correlation Coefficient
Summary
Chapter 12: Introduction to Convolutional Neural Networks
Why Convolutional Neural Networks?
Convolution
Scanning with the Kernel
Convolution for Image Processing
Anatomy of a Convolutional Neural Network
Different Types of Layers
Passing Data Through the CNN
Convolutional Layers
How a Convolution Layer Works
Using a Convolutional Layer
Multiple Convolutional Layers
Initializing a Convolutional Layer
Pooling Layers
Fully Connected Layers
Fully Convolutional Layers
Step by Step
Summary
Chapter 13: Experiments with Keras and MNIST
Building CNNs in Keras
Loading the MNIST Data
Building Our Model
Training and Evaluating the Model
Plotting the Error
Basic Experiments
Architecture Experiments
Training Set Size, Minibatches, and Epochs
Optimizers
Fully Convolutional Networks
Building and Training the Model
Making the Test Images
Testing the Model
Scrambled MNIST Digits
Summary
Chapter 14: Experiments with CIFAR-10
A CIFAR-10 Refresher
Working with the Full CIFAR-10 Dataset
Building the Models
Analyzing the Models
Animal or Vehicle?
Binary or Multiclass?
Transfer Learning
Fine-Tuning a Model
Building Our Datasets
Adapting Our Model for Fine-Tuning
Testing Our Model
Summary
Chapter 15: A Case Study: Classifying Audio Samples
Building the Dataset
Augmenting the Dataset
Preprocessing Our Data
Classifying the Audio Features
Using Classical Models
Using a Traditional Neural Network
Using a Convolutional Neural Network
Spectrograms
Classifying Spectrograms
Initialization, Regularization, and Batch Normalization
Examining the Confusion Matrix
Ensembles
Summary
Chapter 16: Going Further
Going Further with CNNs
Reinforcement Learning and Unsupervised Learning
Generative Adversarial Networks
Recurrent Neural Networks
Online Resources
Conferences
The Book
So Long and Thanks for All the Fish
Index