This work is licensed under a Creative Commons Attribution 4.0 International License
This repository contains a basic implementation of neurons and neural networks using PyTorch tensors.
The activation functions are defined inside the neuron.activation_functions
package.
The relu function of a tensor T computes the element-wise maximum between 0 and each element of T.
Its definition is given by the relu function.
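A minimal sketch of such a relu function, assuming the repository's PyTorch-tensor setup (the use of `torch.clamp` here is an assumption; the repo's actual implementation may differ):

```python
import torch

def relu(t: torch.Tensor) -> torch.Tensor:
    # Element-wise max(0, x): clamp every element of t at a lower bound of 0.
    # Sketch only; the actual neuron.activation_functions code may differ.
    return torch.clamp(t, min=0.0)

x = torch.tensor([-1.0, 0.0, 2.0])
print(relu(x).tolist())  # [0.0, 0.0, 2.0]
```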
The Swish function was proposed by Ramachandran et al. in their paper "Searching for Activation
Functions" (arXiv:1710.05941v2).
The implementation is given by the swish function.
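Swish is defined in the paper as f(x) = x · sigmoid(βx), reducing to SiLU when β = 1. A sketch of what the swish function might look like (the `beta` parameter and its default are assumptions; the repo's signature may differ):

```python
import torch

def swish(t: torch.Tensor, beta: float = 1.0) -> torch.Tensor:
    # Swish: x * sigmoid(beta * x); beta=1.0 gives the SiLU variant.
    # Sketch only; the actual neuron.activation_functions code may differ.
    return t * torch.sigmoid(beta * t)

x = torch.tensor([-1.0, 0.0, 1.0])
print(swish(x))
```

Unlike ReLU, Swish is smooth and non-monotonic: it dips slightly below zero for small negative inputs before approaching zero.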