Hyperparameter Optimization in Machine Learning: Make Your Machine Learning and Deep Learning Models More Efficient

Dive into hyperparameter tuning of machine learning models and focus on what hyperparameters are and how they work. This book discusses different techniques of hyperparameter tuning, from the basics to advanced methods.

This is a step-by-step guide to hyperparameter optimization, starting with what hyperparameters are and how they affect different aspects of machine learning models. It then goes through some basic (brute-force) algorithms of hyperparameter optimization. Further, the author addresses the problem of time and memory constraints, using distributed optimization methods. Next, you'll learn about Bayesian optimization for hyperparameter search, which learns from its previous history. The book discusses frameworks such as Hyperopt and Optuna, which implement sequential model-based global optimization (SMBO) algorithms. During these discussions, you'll focus on different aspects of these libraries, such as the creation of search spaces and distributed optimization. Hyperparameter Optimization in Machine Learning builds an understanding of how these algorithms work and how you can use them in real-life data science problems. The final chapter summarizes the role of hyperparameter optimization in automated machine learning and ends with a tutorial on creating your own AutoML script. Hyperparameter optimization is a tedious task, so sit back and let these algorithms do your work.

What You Will Learn
- Discover how changes in hyperparameters affect the model's performance
- Apply different hyperparameter tuning algorithms to data science problems
- Work with Bayesian optimization methods to create efficient machine learning and deep learning models
- Distribute hyperparameter optimization using a cluster of machines
- Approach automated machine learning using hyperparameter optimization

Who This Book Is For
Professionals and students working with machine learning.
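To give a flavor of the tuning techniques the book covers, here is a minimal sketch of exhaustive grid search with scikit-learn's GridSearchCV. The estimator (an SVM), the parameter grid, and the Iris dataset are illustrative choices, not examples taken from the book itself.

```python
# A minimal grid-search sketch: every combination of C and kernel
# is evaluated with 5-fold cross-validation, and the best-scoring
# hyperparameter combination is retained.
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

param_grid = {"C": [0.1, 1, 10], "kernel": ["linear", "rbf"]}
search = GridSearchCV(SVC(), param_grid, cv=5)
search.fit(X, y)

print(search.best_params_)  # best C/kernel combination found
print(search.best_score_)   # its mean cross-validated accuracy
```

Grid search is the "brute force" baseline the early chapters start from; the later chapters replace it with random search, distributed search, and Bayesian (SMBO) methods that spend the same compute budget more intelligently.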

Author(s): Tanay Agrawal
Publisher: Apress
Year: 2020

Language: English
Pages: 185
City: New York

Table of Contents
About the Author
About the Technical Reviewer
Acknowledgments
Foreword 1
Foreword 2
Introduction
Chapter 1: Introduction to Hyperparameters
Introduction to Machine Learning
Understanding Hyperparameters
The Need for Hyperparameter Optimization
Algorithms and Their Hyperparameters
K-Nearest Neighbor
Support Vector Machine
Decision Tree
Neural Networks
Distribution of Possible Hyperparameter Values
Discrete Variables
Continuous Variables
Probabilistic Distributions
Uniform Distribution
Gaussian Distribution
Exponential Distribution
Chapter 2: Hyperparameter Optimization Using Scikit-Learn
Changing Hyperparameters
Grid Search
Random Search
Parallel Hyperparameter Optimization
Chapter 3: Solving Time and Memory Constraints
Dask
Dask Distributed
Parallel Collections
Dynamic Task Scheduling
Hyperparameter Optimization with Dask
Dask Random Search and Grid Search
Incremental Search
Successive Halving Search
Hyperband Search
Distributing Deep Learning Models
PyTorch Distributed
Horovod
Chapter 4: Bayesian Optimization
Sequential Model-Based Global Optimization
Tree-Structured Parzen Estimator
Hyperopt
Search Space
Parallelizing Trials in TPE
Hyperopt-Sklearn
Hyperas
Chapter 5: Optuna and AutoML
Optuna
Search Space
Underlying Algorithms
Visualization
Distributed Optimization
Automated Machine Learning
Building Your Own AutoML Module
TPOT
Appendix I
Data Cleaning and Preprocessing
Dealing with Nonnumerical Columns
Label Encoding
One-Hot Encoding
Missing Values
Drop the Rows
Mean/Median or Most Frequent/Constant
Imputation Using Regression or Classification
Multivariate Imputation by Chained Equations
Outlier Detection
Z-score
Density-Based Spatial Clustering of Applications with Noise
Feature Selection
F-Test
Mutual Information Test
Recursive Feature Selection
Applying the Techniques
Applying Machine Learning Algorithms
Model Evaluation Methods
Appendix II: Neural Networks: A Brief Introduction to PyTorch and Keras API
Index