Introduction to Deep Learning and Neural Networks with Python™: A Practical Guide is an intensive, step-by-step guide for neuroscientists to fully understand, practice, and build neural networks. The book provides math and Python™ code examples to clarify neural network calculations; by its end, readers will fully understand how neural networks work, starting from the simplest model, Y = X, and building from scratch. It details how a generic gradient descent algorithm works, with mathematical and Python™ examples, and teaches readers to use gradient descent to manually perform all calculations in both the forward and backward passes of training a neural network.
Author(s): Ahmed Fawzy Gad, Fatima Ezzahra Jarmouni
Publisher: Academic Press
Year: 2021
Language: English
Pages: 300
City: London
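The blurb above describes the book's central exercise: starting from the simplest model Y = X and using gradient descent to perform the forward and backward passes by hand. A minimal sketch of that idea (not the authors' code; the weight name, learning rate, and sample values are illustrative assumptions):

```python
# Fit the one-parameter model pred = w * x to the identity mapping y = x
# using plain gradient descent on the squared error.

def forward(w, x):
    # Forward pass: predict the output from the input.
    return w * x

def backward(w, x, target, lr=0.01):
    # Backward pass: squared error E = (pred - target)^2,
    # so by the chain rule dE/dw = 2 * (pred - target) * x.
    pred = forward(w, x)
    grad = 2 * (pred - target) * x
    return w - lr * grad  # gradient descent update

w = 0.5  # arbitrary starting weight
for _ in range(1000):  # epochs
    for x in [1.0, 2.0, 3.0]:  # training samples with target y = x
        w = backward(w, x, target=x)

print(w)  # approaches 1.0, recovering Y = X
```

Each update shrinks the error, so the weight converges to 1 and the network has "learned" the identity function — the pattern the book then generalizes to multiple inputs, hidden layers, and biases.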
Front Matter
Copyright
Dedication
Preface
Acknowledgments
Ahmed Fawzy Gad
Fatima Ezzahra Jarmouni
Chapter 1: Preparing the development environment
Downloading and installing Python™ 3
Installing required libraries
Preparing Ubuntu® virtual machine for Kivy
Preparing Ubuntu® virtual machine for PyPy
Conclusion
Chapter 2: Introduction to artificial neural networks (ANN)
Simplest model Y = X
Error calculation
Introducing weight
Weight as a constant
Weight as a variable
Optimizing the parameter
Introducing bias
Bias as a constant
Bias as a variable
Optimizing the weight and the bias
From mathematical to graphical form of a neuron
Neuron with multiple inputs
Sum of products
Activation function
Conclusion
Chapter 3: ANN with 1 input and 1 output
Network architecture
Forward pass
Forward pass math calculations
Backward pass
Chain rule
Backward pass math calculations
Python™ implementation
Necessary functions
Preparing inputs and outputs
Forward pass
Backward pass
Training network
Conclusion
Chapter 4: Working with any number of inputs
ANN with 2 inputs and 1 output
Math example
Python™ implementation
Code changes
Training ANN
ANN with 10 inputs and 1 output
Training ANN
ANN with any number of inputs
Inputs assignment
Weights initialization
Calculating the SOP
Calculating the SOP to weights derivatives
Calculating the weights gradients
Updating the weights
Conclusion
Chapter 5: Working with hidden layers
ANN with 1 hidden layer with 2 neurons
Forward pass
Forward pass math calculations
Backward pass
Output layer weights
Hidden layer weights
Backward pass math calculations
Output layer weights gradients
Hidden layer weights gradients
Updating weights
Python™ implementation
Forward pass
Backward pass
Complete code
Conclusion
Chapter 6: Using any number of hidden neurons
ANN with 1 hidden layer with 5 neurons
Forward pass
Backward pass
Hidden layer gradients
Python™ implementation
Forward pass
Backward pass
More iterations
Any number of hidden neurons in 1 layer
Weights initialization
Forward pass
Backward pass
ANN with 8 hidden neurons
Conclusion
Chapter 7: Working with 2 hidden layers
ANN with 2 hidden layers with 5 and 3 neurons
Editing Chapter 6 implementation to work with an additional layer
Preparing inputs, outputs, and weights
Forward pass
Backward pass
First hidden layer gradients
ANN with 2 hidden layers with 10 and 8 neurons
Conclusion
Chapter 8: ANN with 3 hidden layers
ANN with 3 hidden layers with 5, 3, and 2 neurons
Required changes in the forward pass
Required changes in the backward pass
Editing Chapter 7 implementation to work with 3 hidden layers
Preparing inputs, outputs, and weights
Forward pass
Working with any number of layers
Backward pass
Python™ implementation
ANN with 10 inputs and 3 hidden layers with 8, 5, and 3 neurons
Conclusion
Chapter 9: Working with any number of hidden layers
What to do for a generic gradient descent implementation?
Generic approach for gradients calculation
Output layer gradients
Hidden layer gradients
Calculations summary
Python™ implementation
backward_pass() method
Output layer
Hidden layers
Example: Training the network
Making predictions
Conclusion
Chapter 10: Generic ANN
Preparing initial weights for any number of outputs
Calculating gradients for all output neurons
Network with 2 outputs
Network with 3 outputs
Working with multiple training samples
Calculating the size of the inputs and the outputs
Iterating through the training samples
Calculating the network error
Implementing ReLU
New implementation for MLP class
Example for training network with multiple samples
Using bias
Initializing the network bias
Using bias in the forward pass
Updating bias using gradient descent
Complete implementation with bias
Stochastic and batch gradient descent
Example
Conclusion
Chapter 11: Running neural networks in Android
Building the first Kivy app
Getting started with KivyMD
MDTextField
MDCheckbox
MDDropdownMenu
MDFileManager
MDSpinner
Training network in a thread
Neural network KivyMD app
neural.kv
main.py
Use the app
Building the Android app
Conclusion
Index