F# for Machine Learning Essentials

Author(s): Sudipta Mukherjee
Publisher: Packt Publishing
Year: 2016

Language: English
Pages: 194

Cover
Copyright
Credits
Foreword
About the Author
Acknowledgments
About the Reviewers
www.PacktPub.com
Table of Contents
Preface
Chapter 1: Introduction to Machine Learning
Objective
Getting in touch
Different areas where machine learning is being used
Why use F#?
Supervised machine learning
Training and test dataset/corpus
Some motivating real life examples of supervised learning
Nearest Neighbour algorithm (a.k.a k-NN algorithm)
Distance metrics
Decision tree algorithms
Unsupervised learning
Machine learning frameworks
Machine learning for fun and profit
Recognizing handwritten digits – your "Hello World" ML program
How does this work?
Summary
Chapter 2: Linear Regression
Objective
Different types of linear regression algorithms
APIs used
Math.NET Numerics for F# 3.7.0
Getting Math.NET
Experimenting with Math.NET
The basics of matrices and vectors (a short and sweet refresher)
Creating a vector
Creating a matrix
Finding the transpose of a matrix
Finding the inverse of a matrix
Trace of a matrix
QR decomposition of a matrix
SVD of a matrix
Linear regression method of least square
Finding linear regression coefficients using F#
Finding the linear regression coefficients using Math.NET
Putting it together with Math.NET and FsPlot
Multiple linear regression
Multiple linear regression and variations using Math.NET
Weighted linear regression
Plotting the result of multiple linear regression
Ridge regression
Multivariate multiple linear regression
Feature scaling
Summary
Chapter 3: Classification Techniques
Objective
Different classification algorithms you will learn
Some interesting things you can do
Binary classification using k-NN
How does it work?
Finding cancerous cells using k-NN: a case study
Understanding logistic regression
The sigmoid function chart
Binary classification using logistic regression (using Accord.NET)
Multiclass classification using logistic regression
How does it work?
Multiclass classification using decision trees
Obtaining and using WekaSharp
How does it work?
Predicting a traffic jam using a decision tree: a case study
Challenge yourself!
Summary
Chapter 4: Information Retrieval
Objective
Different IR algorithms you will learn
What interesting things can you do?
Information retrieval using tf-idf
Measures of similarity
Generating a PDF from a histogram
Minkowski family
L1 family
Intersection family
Inner Product family
Fidelity family or squared-chord family
Squared L2 family
Shannon's Entropy family
Similarity of asymmetric binary attributes
Some example usages of distance metrics
Finding similar cookies using asymmetric binary similarity measures
Grouping/clustering color images based on Canberra distance
Summary
Chapter 5: Collaborative Filtering
Objective
Different classification algorithms you will learn
Vocabulary of collaborative filtering
Baseline predictors
Basis of User-User collaborative filtering
Implementing basic user-user collaborative filtering using F#
Code walkthrough
Variations of gap calculations and similarity measures
Item-item collaborative filtering
Top-N recommendations
Evaluating recommendations
Prediction accuracy
Confusion matrix (decision support)
Ranking accuracy metrics
Prediction-rating correlation
Working with real movie review data (Movie Lens)
Summary
Chapter 6: Sentiment Analysis
Objective
What you will learn
A baseline algorithm for SA using SentiWordNet lexicons
Handling negations
Identifying praise or criticism with sentiment orientation
Pointwise Mutual Information
Using SO-PMI to find sentiment analysis
Summary
Chapter 7: Anomaly Detection
Objective
Different classification algorithms
Some cool things you will do
The different types of anomalies
Detecting point anomalies using IQR (Interquartile Range)
Detecting point anomalies using Grubb's test
Grubb's test for multivariate data using Mahalanobis distance
Code walkthrough
Chi-squared statistic to determine anomalies
Detecting anomalies using density estimation
Strategy to convert a collective anomaly to a point anomaly problem
Dealing with categorical data in collective anomalies
Summary
Index