Deep Learning with Exponential Linear Units (ELUs) activation function
Updated Feb 18, 2018 - Python
Project analyzing and implementing https://arxiv.org/abs/1602.02068
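The ELU activation named in the entry above can be sketched in a few lines. This is a minimal NumPy version of the commonly cited definition, f(x) = x for x > 0 and α(eˣ − 1) otherwise; the function name and the default α = 1.0 are illustrative assumptions, not taken from the repository itself.

```python
import numpy as np

def elu(x, alpha=1.0):
    """Exponential Linear Unit: x for x > 0, alpha * (exp(x) - 1) otherwise.

    alpha = 1.0 is a common default; the repo may use a different value.
    """
    x = np.asarray(x, dtype=float)
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))
```

Unlike ReLU, ELU saturates to −α for large negative inputs, which keeps mean activations closer to zero.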
Deep Learning exercises provided by DataCamp
Jupyter Notebooks for Visualization
Explains basic neural network concepts such as activation functions, forward propagation, backward propagation, gradient descent, and finding the optimized weights and biases.
Bag of: activation functions, an L2-norm function, a cosine-similarity function, and how to read GloVe vector files
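The L2-norm and cosine-similarity utilities mentioned above are standard; a minimal NumPy sketch is given below. The function names are illustrative and not taken from the repository.

```python
import numpy as np

def l2_norm(v):
    """Euclidean (L2) norm: sqrt of the sum of squared components."""
    v = np.asarray(v, dtype=float)
    return np.sqrt(np.sum(np.square(v)))

def cosine_similarity(a, b):
    """Cosine of the angle between vectors a and b, in [-1, 1]."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    return np.dot(a, b) / (l2_norm(a) * l2_norm(b))
```

Cosine similarity of word vectors (e.g. GloVe embeddings) is the usual way to measure semantic closeness between words.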
Draws activation functions, i.e., generates images of activation functions
Graduated optimization on neural networks via adjustment of activation functions
[TCAD 2018] Code for “Design Space Exploration of Neural Network Activation Function Circuits”
Essential deep learning algorithms, concepts, examples and visualizations with TensorFlow. Popular and custom neural network architectures. Applications of neural networks.
Feed-forward neural network to classify Facebook post likes into low, moderate, or high classes; backpropagation is implemented with a decaying learning rate
Growing collection of different machine learning metrics.
Artificial Neural Networks Activation Functions
A Basic Introduction To Neural Networks: Machine Being Human!
A teacher-student activation layer model based on perceptrons, implemented in PyTorch
This is a Keras implementation of the paper "LiSHT: Non-Parametric Linearly Scaled Hyperbolic Tangent Activation Function for Neural Networks" - https://arxiv.org/abs/1901.05894
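The LiSHT function referenced above is defined in the linked paper as x · tanh(x). A minimal NumPy sketch (the Keras layer wrapping from the repository is omitted, and the function name is illustrative):

```python
import numpy as np

def lisht(x):
    """LiSHT activation: x * tanh(x).

    Non-negative everywhere and symmetric about zero, unlike tanh itself.
    """
    x = np.asarray(x, dtype=float)
    return x * np.tanh(x)
```

In Keras, such a function can typically be passed directly as the `activation` argument of a layer or wrapped in a `Lambda` layer.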
Visualizations of various activation functions for neural networks in TensorFlow
Bionodal Root Unit (BRU) implemented with Chainer-v5