
You activated my neuron!

This work is licensed under a Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/).

This repository contains a basic implementation of neurons and neural networks using PyTorch tensors.

Activation functions

The activation functions are defined inside the neuron.activation_functions package.

Rectified Linear Unit (ReLU)

The ReLU of a tensor T is computed element-wise as the maximum between 0 and each element of T, i.e. relu(x) = max(0, x). Its implementation is given by the relu function.
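A minimal sketch of what such a function might look like with PyTorch tensors; the name relu comes from the text above, but the exact signature and body in the repository may differ:

```python
import torch


def relu(t: torch.Tensor) -> torch.Tensor:
    """Element-wise ReLU: returns max(0, x) for each element x of t."""
    return torch.maximum(t, torch.zeros_like(t))


# Example usage:
# relu(torch.tensor([-1.0, 0.0, 2.0]))  ->  tensor([0., 0., 2.])
```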

Swish

The Swish function, proposed by Ramachandran et al. in their paper "Searching for Activation Functions" (arXiv:1710.05941v2), is defined as swish(x) = x · sigmoid(βx). The implementation is given by the swish function.
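A minimal sketch following the paper's definition; the name swish comes from the text above, while the beta parameter (and its default of 1, which makes Swish coincide with SiLU) is an assumption that may differ from the repository's actual signature:

```python
import torch


def swish(t: torch.Tensor, beta: float = 1.0) -> torch.Tensor:
    """Element-wise Swish: x * sigmoid(beta * x) for each element x of t.

    With beta = 1.0 this reduces to the SiLU activation.
    """
    return t * torch.sigmoid(beta * t)


# Example usage:
# swish(torch.tensor([-1.0, 0.0, 2.0]))  ->  tensor([-0.2689,  0.0000,  1.7616])
```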
