Introduction to Deep Learning (Black/White version): with complete Python and TensorFlow examples

Author(s): Juergen Brauer
Year: 2018
Language: English

What is Deep Learning?
"How are they called? Neutrons?"
Convolutional Neural Networks drive the boom
Deep Learning without neurons
Neuroscience as a treasure trove for machine learning
About this book
Deep Learning: An agile field
Exponential growth of interest
Acquisition of DL startups
Hardware for DL
Software for DL
The biological role model: The Neuron
Your brain - A fascinating computing device
Structure of a neuron
Signal processing by action potentials
Synapses
Neuronal plasticity
Spike-Timing Dependent Plasticity (STDP)
The many faces of a neuron
What is the function of a biological neuron?
Neurons as spatial feature or evidence detectors
Neurons as temporal coincidence detectors
Perceptron neuron model
Neurons as filters
Other neuron models
Neural Coding
The Perceptron
The Perceptron neuro-computer
Perceptron learning
Perceptron in Python
Limitations of the Perceptron
Self-Organizing Maps
The SOM neural network model
A SOM in Python
SOM and the Cortex
Multi-Layer Perceptrons
The goal
The basic idea is gradient descent
Splitting the weight change formula into three parts
Computing the first part
Computing the second part
Computing the third part
Backpropagation pseudo code
MLP in Python
Visualization of decision boundaries
The need for non-linear transfer functions
TensorFlow
Introduction
Training a linear model with TensorFlow
An MLP with TensorFlow
Convolutional Neural Networks
Introduction
Some history about the CNN model
Convolutional and pooling layers in TensorFlow
Parameters to be defined for a convolution layer
How to compute the dimension of an output tensor
Parameters to be defined for a pooling layer
A CNN in TensorFlow
Deep Learning Tricks
Fighting against vanishing gradients
Momentum optimization
Nesterov Momentum Optimization
AdaGrad
RMSProp
Adam
Comparison of optimizers
Batch normalization
Beyond Deep Learning
Principle of attention
Principle of lifelong learning
Principle of incremental learning
Principle of embodiment
Principle of prediction
Cognitive architectures
Exercises
Ex. 1 - Preparing to work with Python
Ex. 2 - Python syntax
Ex. 3 - Understanding convolutions
Ex. 4 - NumPy
Ex. 5 - Perceptron
Ex. 6 - Speech Recognition with a SOM
Ex. 7 - MLP with feedforward step
Ex. 8 - Backpropagation
Ex. 9 - An MLP with TensorFlow
Ex. 10 - CNN Experiments
Ex. 11 - CNN for word recognition using Keras
Ex. 12 - Vanishing gradients problem
Ex. 13 - Batch normalization in TensorFlow