Activation Functions


This book describes the activation functions frequently used in deep neural networks.

Author(s): Yasin Kutuk
Publisher: Peter Lang
Year: 2022

Language: English
Pages: 84

Cover
Title
Copyright
About the author
About the book
This eBook can be cited
Table of Contents
Introduction
1. Machine Learning
1.1. Types of Machine Learning
1.2. Supervised Learning
1.2.1. Regression
1.2.2. Classification and Logistic Regression
1.3. Unsupervised Learning
1.3.1. Clustering
1.4. Semi-supervised Learning
1.5. Reinforcement Learning
1.6. Federated Learning
1.7. Transfer Learning
1.8. Ensemble Learning
2. Neural Networks
2.1. Single Layer Perceptron
2.2. Deep Neural Networks
2.3. Architecture Design
2.3.1. Feed Forwards
2.3.2. Convolutional Neural Networks
2.3.3. Sequence Modeling
2.3.3.1. Recurrent Neural Networks
3. Activation Functions
4. Monotonic Activation Functions
4.1. Linear Function
4.1.1. Identity Function
4.1.2. Piecewise Linear Function
4.2. Threshold (Unit Heaviside, Binary, Step) Function
4.3. Sigmoid Function
4.3.1. Bipolar Sigmoid Function
4.4. Rectified Linear Unit (ReLU)
4.4.1. Leaky Rectified Linear Unit (LReLU)
4.4.2. Parametric Rectified Linear Unit (PReLU)
4.4.3. Randomized Rectified Linear Unit (RReLU)
4.5. Exponential Linear Unit (ELU)
4.5.1. Scaled Exponential Linear Unit (SELU)
4.6. SoftMax Function
4.7. Odd Activation (Signum, Sign) Function
4.8. Maxout Function
4.9. Softsign Function
4.10. Elliott Function
4.11. Hyperbolic Tangent (Tanh) Function
4.11.1. Arc Tangent Function
4.11.2. LeCun’s Hyperbolic Tangent Function
4.12. Complementary log-log Function
4.13. Softplus Function
4.14. Bent Identity Function
4.15. Soft Exponential Function
5. Periodic Activation Functions
5.1. Sinusoidals
5.1.1. Sine Wave Function
5.1.2. Cardinal Sine Function (Sinc)
5.1.3. Fourier Transform (FT, DFT/FFT)
5.1.4. Discrete-time Dimensional Fourier Transform (DTFT, Shannon-Nyquist)
5.1.5. Short-Time Fourier Transform (Gabor, STFT)
5.1.6. Wavelet Transform
5.2. Non-sinusoidals
5.2.1. Gaussian (Normal) Distribution Function
5.2.2. Square Wave Function
5.2.3. Triangle Wave Function
5.2.4. Sawtooth Wave Function
5.2.5. S-shaped Rectified Linear Unit (SReLU)
5.2.6. Adaptive Piecewise Linear Unit (APLU)
6. Bias Unit
References
LaTeX Applications
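
As a quick orientation, a few of the monotonic functions listed in Chapter 4 can be sketched in plain Python. These are the standard textbook definitions, not code taken from the book itself:

```python
import math

def sigmoid(x):
    """4.3 Sigmoid: maps any real input to (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def relu(x):
    """4.4 Rectified Linear Unit: max(0, x)."""
    return max(0.0, x)

def leaky_relu(x, alpha=0.01):
    """4.4.1 Leaky ReLU: keeps a small slope alpha for negative inputs."""
    return x if x > 0 else alpha * x

def tanh(x):
    """4.11 Hyperbolic tangent: maps any real input to (-1, 1)."""
    return math.tanh(x)

def softmax(xs):
    """4.6 SoftMax: exponentiates and normalizes a vector to sum to 1."""
    m = max(xs)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]
```

For example, `sigmoid(0.0)` returns `0.5`, `relu(-3.0)` returns `0.0`, and the elements of `softmax([1.0, 2.0, 3.0])` sum to 1.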