Simple Deep Learning

This is a Python implementation of deep learning models and algorithms with minimal use of external libraries. Simple Deep Learning aims to teach the basic concepts of deep learning by building a library from scratch.

Installation

Activation Functions

The following activation functions are defined in activation.py as classes with forward and backward methods.

ReLU (Rectified Linear Unit)

forward: f(x) = max(0, x)

backward: f'(x) = 1 if x > 0, else 0
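
As a concrete illustration, a minimal ReLU class with this forward/backward interface might look like the sketch below; the mask-caching detail is an assumption, and the actual class in activation.py may differ.

```python
import numpy as np

class ReLU:
    def forward(self, x):
        self.mask = x > 0          # cache which inputs were positive
        return np.where(self.mask, x, 0.0)

    def backward(self, dout):
        return dout * self.mask    # gradient passes only where x > 0
```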

LReLU (Leaky Rectified Linear Unit)

forward: f(x) = x if x > 0, else αx (with a small fixed slope α, typically 0.01)

backward: f'(x) = 1 if x > 0, else α

PReLU (Parametric Rectified Linear Unit)

forward: f(x) = x if x > 0, else αx (here α is a learned parameter)

backward: ∂f/∂x = 1 if x > 0, else α; ∂f/∂α = 0 if x > 0, else x

ELU (Exponential Linear Unit)

SELU (Scaled Exponential Linear Unit)

Sigmoid (Logistic Function)

forward: σ(x) = 1 / (1 + exp(−x))

backward: σ'(x) = σ(x)(1 − σ(x))
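
Because the derivative can be written in terms of the output itself, a sigmoid class typically caches its forward result for the backward pass. A minimal sketch, assuming the same interface as above (not necessarily the repository's exact code):

```python
import numpy as np

class Sigmoid:
    def forward(self, x):
        self.out = 1.0 / (1.0 + np.exp(-x))         # cache the output
        return self.out

    def backward(self, dout):
        return dout * self.out * (1.0 - self.out)   # σ'(x) = σ(x)(1 − σ(x))
```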

SoftPlus

Tanh

Arctan

SoftSign

Layers

The following layers are defined in layers.py as classes with forward and backward methods (some of them also have a predict method).

Convolution Layer (3D)

This layer is compatible with minibatches and operates on 3D tensors of shape (channel, height, width). The input data therefore has a shape of (batch size, channel, height, width).
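
The spatial output size follows the standard convolution formula; a small helper makes the shape contract explicit (the helper name is illustrative, not part of layers.py):

```python
def conv_output_size(size, kernel, stride=1, pad=0):
    """Standard output-size formula for one spatial dimension."""
    return (size + 2 * pad - kernel) // stride + 1

# A 3x3 kernel with stride 1 and padding 1 preserves a 28x28 input:
assert conv_output_size(28, kernel=3, stride=1, pad=1) == 28
```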

Pooling Layer

Two options, max pooling and average pooling, are available for this layer.
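
As a rough illustration of the max-pooling case, a 2x2 window with stride 2 can be computed by reshaping. This is a simplified sketch assuming even height and width; average pooling would use mean() instead of max():

```python
import numpy as np

def max_pool_2x2(x):
    """2x2 max pooling, stride 2, on input of shape (N, C, H, W)."""
    n, c, h, w = x.shape
    windows = x.reshape(n, c, h // 2, 2, w // 2, 2)
    return windows.max(axis=(3, 5))   # maximum of each 2x2 window

x = np.arange(16, dtype=float).reshape(1, 1, 4, 4)
print(max_pool_2x2(x))   # the max of each 2x2 window: 5, 7, 13, 15
```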

Affine Layer

This layer accepts tensor inputs, so a 3D (convolutional) layer can be connected directly to this fully connected (2D) layer.
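
A minimal sketch of such an affine layer, assuming the same forward/backward interface as the activation classes (the actual class in layers.py may differ): it flattens any tensor input to 2D for the matrix multiply and restores the original shape in backward.

```python
import numpy as np

class Affine:
    def __init__(self, W, b):
        self.W, self.b = W, b

    def forward(self, x):
        self.orig_shape = x.shape
        self.x = x.reshape(x.shape[0], -1)   # (N, C, H, W) -> (N, D)
        return self.x @ self.W + self.b

    def backward(self, dout):
        self.dW = self.x.T @ dout            # gradients for the parameters
        self.db = dout.sum(axis=0)
        dx = dout @ self.W.T
        return dx.reshape(self.orig_shape)   # restore the input shape
```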

Maxout Layer

This layer can only be used as a fully connected (2D) layer.
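
For reference, a maxout unit takes the maximum over k affine pieces; a minimal forward-pass sketch, where the shapes and names are assumptions for illustration only:

```python
import numpy as np

def maxout_forward(x, W, b):
    """x: (N, D), W: (D, M, k), b: (M, k) -> output (N, M)."""
    z = np.einsum('nd,dmk->nmk', x, W) + b   # k affine pieces per unit
    return z.max(axis=2)                     # keep the largest piece
```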

Batch Normalization Layer

Dropout Layer

Loss Functions

MAE (Mean Absolute Error)

MSE (Mean Square Error)

RMSE (Root Mean Square Error)
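
Using the standard definitions, the three losses can be sketched as follows (the classes in the repository may also provide backward methods):

```python
import numpy as np

def mae(y_pred, y_true):
    return np.mean(np.abs(y_pred - y_true))   # mean absolute error

def mse(y_pred, y_true):
    return np.mean((y_pred - y_true) ** 2)    # mean squared error

def rmse(y_pred, y_true):
    return np.sqrt(mse(y_pred, y_true))       # root of the MSE
```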

Reference

The following links were used as references:
https://en.wikipedia.org/wiki/Activation_function
http://www.deeplearningbook.org/contents/optimization.html
