Machine Learning for Beginners: Build and deploy Machine Learning systems using Python

The second edition of “Machine Learning for Beginners” addresses key concepts and subjects in Machine Learning. The book begins with an introduction to the foundational principles of machine learning, followed by a discussion of data preprocessing. It then delves into feature extraction and feature selection, covering techniques such as the Fourier transform, the short-time Fourier transform, and local binary patterns, before turning to principal component analysis and linear discriminant analysis. Next, the book covers model representation, training, testing, and cross-validation, with an emphasis on regression and classification, explaining and implementing methods such as gradient descent. Essential classification techniques, including k-nearest neighbors, logistic regression, and naive Bayes, are discussed in detail. The book then presents an overview of neural networks, including their biological background, the limitations of the perceptron, and the backpropagation model, and goes on to cover support vector machines, kernel methods, decision trees, and ensemble models. The final section introduces unsupervised learning and deep learning.

What you will learn:
- Acquire skills to effectively prepare data for machine learning tasks.
- Learn how to implement learning algorithms from scratch.
- Harness the power of scikit-learn to efficiently implement common algorithms.
- Get familiar with various feature selection and feature extraction methods.

Who this book is for:
This book is for undergraduate and postgraduate Computer Science students, as well as professionals looking to transition into Machine Learning; a foundational familiarity with Python is assumed.
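To give a feel for the dual approach the description mentions, here is a minimal sketch (this editor's illustration, not an excerpt from the book; the toy data, learning rate, and variable names are assumptions) that fits the same linear-regression model twice: once from scratch with batch gradient descent, and once with scikit-learn:

import numpy as np
from sklearn.linear_model import LinearRegression

# Toy data (assumed): y = 2x + 1 plus Gaussian noise.
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(50, 1))
y = 2 * X[:, 0] + 1 + rng.normal(0, 0.5, size=50)

# From scratch: batch gradient descent on the mean squared error.
w, b, lr = 0.0, 0.0, 0.01
for _ in range(2000):
    err = w * X[:, 0] + b - y             # residuals of the current fit
    w -= lr * 2 * np.mean(err * X[:, 0])  # gradient of MSE w.r.t. w
    b -= lr * 2 * np.mean(err)            # gradient of MSE w.r.t. b

# With scikit-learn: the same model fitted in one line.
model = LinearRegression().fit(X, y)

print(f"scratch: w={w:.2f}, b={b:.2f}")
print(f"sklearn: w={model.coef_[0]:.2f}, b={model.intercept_:.2f}")

Both routes recover roughly w = 2 and b = 1 on this data; the table of contents below shows the book pairing from-scratch and SKLearn implementations in the same way for k-nearest neighbors, the perceptron, and other algorithms.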

Author(s): Dr. Harsh Bhasin
Edition: 2
Publisher: BPB Online
Year: 2023

Language: English
Pages: 472

Cover
Title Page
Copyright Page
Dedication Page
About the Author
About the Reviewers
Acknowledgement
Preface
Table of Contents
Section I: Fundamentals
1. An Introduction to Machine Learning
Introduction
Structure
Objectives
Conventional Algorithms and Machine Learning
Types of Learning
Supervised Machine Learning
Unsupervised Learning
Semi-supervised Learning
Reinforcement Learning
Applications
Natural Language Processing
Weather Forecasting
Robot Control
Speech Recognition
Business Intelligence
History
Case Study I - YouTube Recommendation System
Case Study II - Detection of Alzheimer’s Disease
Fun with Machine Learning
AutoDraw
NightCafe
OpenML
Generate Music: beatoven.ai
Tools for Machine Learning and Deep Learning
Conclusion
Multiple choice questions
Theory questions
Explore
2. The Beginning: Data Pre-Processing
Introduction
Structure
Objectives
Preprocessing
Missing values
Data integration
Data normalization
Conclusion
Multiple choice questions
Programming/Numerical
Theory
Bibliography
3. Feature Selection
Introduction
Structure
Objectives
Types of feature selection
Variance Threshold
Chi-Squared test
Pearson correlation
Recursive Feature Elimination
Genetic Algorithm for feature selection
Fisher Discriminant Ratio
Conclusion
Multiple choice questions
Programming/Numerical
Theory
4. Feature Extraction
Introduction
Structure
Objectives
Statistical features of data
Audio data
Fourier Transform
Short-Time Fourier Transform
Discrete Wavelet Transform
Images
Patches
sklearn.feature_extraction.image.extract_patches_2d
Local Binary Patterns
Histogram of oriented gradients
Principal component analysis
Gray Level Co-occurrence Matrix
Gray Level Run Length
Case study: Face classification
Data
Conversion to grayscale
Feature extraction
Splitting of data
Feature selection
Forward feature selection
Classifier
Observation and conclusion
Conclusion
Multiple choice questions
Theory
Programming
5. Model Development
Introduction
Structure
Objectives
Machine Learning pipeline
Frameworks
Train, test, and validation data
Underfitting and overfitting
Bias and variance
Bias and underfitting
How to reduce Bias
How to reduce Variance
Evaluating a model: Performance measures for Classification
Conclusion
Multiple choice questions
Theory
Explore
Section II: Supervised Learning
6. Regression
Introduction
Structure
Objectives
The line of best fit
Evaluating Regression
Gradient descent method
Implementation
Linear regression using SKLearn
Finding weights without iteration
Regression using K-nearest neighbors
Predicting Popularity of a song using Regression
Conclusion
Multiple choice questions
Theory
Experiments
7. K-Nearest Neighbors
Introduction
Structure
Objectives
Motivation
Nearest neighbor
K Nearest Neighbors
Algorithm
Implementation from Scratch
Issues
Decision boundary
K Neighbors Classifier in SKLearn
Regression using K Nearest Neighbors
Algorithm
Selecting the value of K
Experiments: K Nearest Neighbors
Conclusion
Multiple choice questions
Theory/Application
Explore
Bibliography
Lecture notes
SKLearn
Base paper
8. Classification: Logistic Regression and Naïve Bayes Classifier
Introduction
Structure
Objectives
Basics
Logistic Regression
Logistic Regression using SKLearn
Experiments: Logistic Regression
Naïve Bayes Classifier
The GaussianNB Classifier of SKLearn
Implementation of Gaussian Naïve Bayes
Conclusion
Multiple choice questions
Theory
Numerical/ programs
9. Neural Network I: The Perceptron
Introduction
Structure
Objectives
The brain
The neuron
The McCulloch-Pitts model
Limitations of the McCulloch-Pitts model
The Rosenblatt perceptron model
Algorithm
Activation functions
Unit step
sgn
Sigmoid
Derivative
Hyperbolic tangent (tanh)
Implementation
Learning
Perceptron using SKLearn
Experiments
Conclusion
Multiple choice questions
Theory questions
Programming/Experiments
10. Neural Network II: The Multi-Layer Perceptron
Introduction
Structure
Objectives
History of neural networks
Introduction to Multi-Layer Perceptron
Architecture
Back-propagation algorithm
Halt
Learning
Implementation
Multilayer Perceptron using SKLearn
Experiments
Conclusion
Multiple choice questions
Theory questions
Practical/Coding
Lecture notes
11. Support Vector Machines
Introduction
Structure
Objectives
Maximum Margin Classifier
Maximizing the margins
The non-separable patterns and the cost parameter
The kernel trick
sklearn.svm.SVC
Experiments
Conclusion
Multiple choice questions
Theory questions
Experiments
12. Decision Trees
Introduction
Structure
Objectives
Introduction to Decision Trees
Terminology
Information Gain and Gini Index
Information Gain
Gini Index
Coming back
Containing the depth of a tree
Implementation of a decision tree using SKLearn
Experiments
Experiment 1 – Iris Dataset, three classes
Experiment 2 – Breast Cancer dataset, two classes
Conclusion
Multiple choice questions
Theory
Numerical/Programming
13. An Introduction to Ensemble Learning
Introduction
Structure
Objectives
Boosting
Types of Boosting
Random Forests
Implementations
Preparing data for classification
Conclusion
Multiple choice questions
Applications
References
Section III: Unsupervised Learning and Deep Learning
14. Clustering
Introduction
Structure
Objectives
Supervised Learning
Clustering
Applications of clustering
K-means
Algorithm: K-means
Segmentation using K-means
Finding the optimal number of clusters
Spectral clustering
Algorithm: Spectral clustering
Hierarchical clustering
Implementation
K-means
Experiment 1
Experiment 2
Experiment 3
Spectral clustering
Experiment 4
Experiment 5
Experiment 6
Agglomerative clustering
Experiment 7
Experiment 8
Experiment 9
DBSCAN
Conclusion
Multiple choice questions
Theory
Numerical
Programming
References
15. Deep Learning
Introduction
Structure
Objectives
Definitions
How is Deep Learning different from Machine Learning?
The factors that promoted Deep Learning
Recap: Deep Neural Networks
Convolutional Neural Network
First CNN for OCR (LeNet)
LeNet architecture
Applications of Deep Learning
Conclusion
Multiple choice questions
Theory
Bibliography
Appendix
Appendix 1: Glossary
Artificial Intelligence
Machine Learning
Deep Learning
Supervised Learning
Unsupervised Learning
Semi-Supervised Learning
Reinforcement Learning
Feature Selection
Filter methods
Wrapper methods
Overfitting
Underfitting
Bias and Underfitting
Variance
Appendix 2: Methods/Techniques
Preprocessing steps
Train Test Split
K-Fold Validation
Machine Learning pipeline
Techniques of Feature Selection
Feature Extraction
Gradient Descent
Back-propagation Algorithm
Regression and Classification methods
Steps to create a Decision Tree (using Entropy)
Selecting the value of K
Appendix 3: Important Metrics and Formulas
Classification Metrics
Confusion Matrix
Performance measures
Regression Metrics
Euclidean Distance
Manhattan Distance
Minkowski Distance
Entropy
Gini Index
Appendix 4: Visualization - Matplotlib
Introduction
Line chart
Curve
Multiple Vectors
Scatter Plot
Box Plot
Histogram
Pie chart
Case Study
References
Answers to Multiple Choice Questions
Bibliography
Index