You've decided to tackle machine learning: maybe you're job hunting, embarking on a new project, or just think self-driving cars are cool. But where do you start? It's easy to be intimidated, even as a software developer. The good news is that it doesn't have to be that hard. Master machine learning by writing code one line at a time, from simple learning programs all the way to a true deep learning system. Tackle the hard topics by breaking them down so they're easier to understand, and build your confidence by getting your hands dirty.
Peel away the obscurities of machine learning, starting from scratch and going all the way to deep learning. Machine learning can be intimidating, with its reliance on math and algorithms that most programmers don't encounter in their regular work. Take a hands-on approach, writing the Python code yourself, without any libraries to obscure what's really going on. Iterate on your design, and add layers of complexity as you go.
Build an image recognition application from scratch with supervised learning. Predict the future with linear regression. Dive into gradient descent, a fundamental algorithm that drives most of machine learning. Create perceptrons to classify data. Build neural networks to tackle more complex and sophisticated data sets. Train and refine those networks with backpropagation and batching. Layer the neural networks, eliminate overfitting, and add convolution to transform your neural network into a true deep learning system.
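To give a taste of the hands-on, libraries-free style the book promises, here is a minimal sketch of the kind of code involved: fitting a line y = w * x to data with one-variable gradient descent, in pure Python. The function names, data, and hyperparameters are illustrative, not the book's actual code.

```python
def loss(x, y, w):
    # Mean squared error of the prediction w * x over the dataset.
    return sum((w * xi - yi) ** 2 for xi, yi in zip(x, y)) / len(x)


def gradient(x, y, w):
    # Derivative of the mean squared error with respect to w.
    return sum(2 * xi * (w * xi - yi) for xi, yi in zip(x, y)) / len(x)


def train(x, y, iterations=1000, lr=0.001):
    # Repeatedly step w against the gradient to shrink the loss.
    w = 0.0
    for _ in range(iterations):
        w -= gradient(x, y, w) * lr
    return w


x = [1, 2, 3, 4]
y = [2, 4, 6, 8]  # perfectly linear data: y = 2 * x
w = train(x, y)
print(round(w, 2))  # converges toward 2.0
```

Every step is ordinary arithmetic on plain lists, which is exactly the point: nothing is hidden behind a framework call.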
Start from the beginning and code your way to machine learning mastery.
What You Need:
The examples in this book are written in Python, but don't worry if you don't know this language: you'll pick up all the Python you need very quickly. Apart from that, you'll only need your computer and your code-adept brain.
Author(s): Paolo Perrotta
Edition: 1
Publisher: Pragmatic Bookshelf
Year: 2020
Language: English
Commentary: Vector PDF
Pages: 342
City: Raleigh, NC
Tags: Machine Learning; Neural Networks; Deep Learning; Convolutional Neural Networks; Classification; Keras; Gradient Descent; Hyperparameter Tuning; Linear Regression; Logistic Regression; Perceptron; Overfitting; Testing; Activation Functions; Batch Learning; Backpropagation
Cover
Table of Contents
Acknowledgments
How the Heck Is That Possible?
About This Book
Before We Begin
Part I—From Zero to Image Recognition
1. How Machine Learning Works
Programming vs. Machine Learning
Supervised Learning
The Math Behind the Magic
Setting Up Your System
2. Your First Learning Program
Getting to Know the Problem
Coding Linear Regression
Adding a Bias
What You Just Learned
Hands On: Tweaking the Learning Rate
3. Walking the Gradient
Our Algorithm Doesn’t Cut It
Gradient Descent
What You Just Learned
Hands On: Basecamp Overshooting
4. Hyperspace!
Adding More Dimensions
Matrix Math
Upgrading the Learner
Bye Bye, Bias
A Final Test Drive
What You Just Learned
Hands On: Field Statistician
5. A Discerning Machine
Where Linear Regression Fails
Invasion of the Sigmoids
Classification in Action
What You Just Learned
Hands On: Weighty Decisions
6. Getting Real
Data Come First
Our Own MNIST Library
The Real Thing
What You Just Learned
Hands On: Tricky Digits
7. The Final Challenge
Going Multiclass
Moment of Truth
What You Just Learned
Hands On: Minesweeper
8. The Perceptron
Enter the Perceptron
Assembling Perceptrons
Where Perceptrons Fail
A Tale of Perceptrons
Part II—Neural Networks
9. Designing the Network
Assembling a Neural Network from Perceptrons
Enter the Softmax
Here’s the Plan
What You Just Learned
Hands On: Network Adventures
10. Building the Network
Coding Forward Propagation
Cross Entropy
What You Just Learned
Hands On: Time Travel Testing
11. Training the Network
The Case for Backpropagation
From the Chain Rule to Backpropagation
Applying Backpropagation
Initializing the Weights
The Finished Network
What You Just Learned
Hands On: Starting Off Wrong
12. How Classifiers Work
Tracing a Boundary
Bending the Boundary
What You Just Learned
Hands On: Data from Hell
13. Batchin’ Up
Learning, Visualized
Batch by Batch
Understanding Batches
What You Just Learned
Hands On: The Smallest Batch
14. The Zen of Testing
The Threat of Overfitting
A Testing Conundrum
What You Just Learned
Hands On: Thinking About Testing
15. Let’s Do Development
Preparing Data
Tuning Hyperparameters
The Final Test
Hands On: Achieving 99%
What You Just Learned… and the Road Ahead
Part III—Deep Learning
16. A Deeper Kind of Network
The Echidna Dataset
Building a Neural Network with Keras
Making It Deep
What You Just Learned
Hands On: Keras Playground
17. Defeating Overfitting
Overfitting Explained
Regularizing the Model
A Regularization Toolbox
What You Just Learned
Hands On: Keeping It Simple
18. Taming Deep Networks
Understanding Activation Functions
Beyond the Sigmoid
Adding More Tricks to Your Bag
What You Just Learned
Hands On: The 10 Epochs Challenge
19. Beyond Vanilla Networks
The CIFAR-10 Dataset
The Building Blocks of CNNs
Running on Convolutions
What You Just Learned
Hands On: Hyperparameters Galore
20. Into the Deep
The Rise of Deep Learning
Unreasonable Effectiveness
Where Now?
Your Journey Begins
A1. Just Enough Python
What Python Looks Like
Python’s Building Blocks
Defining and Calling Functions
Working with Modules and Packages
Creating and Using Objects
That’s It, Folks!
A2. The Words of Machine Learning
Index