A Deep Learning Framework Just for Fun and Education! 🥳
Its purpose is to be a readable, quick-to-understand reference for building a PyTorch-like library with automatic differentiation, written entirely in Python.
## Installation

```bash
pip install meerkat_dl
```
## Features

- Automatic Differentiation
- Tensor Operations
- Computation Graph
- Layers
  - Dense
  - RNN
  - LSTM
  - GRU
  - Convolution (2D and 3D)
  - Attention
  - Dropout
  - MaxPool
  - BatchNorm
  - LayerNorm
- Initialization
  - He
  - Xavier/Glorot
  - LeCun
- Activation Functions
  - ReLU
  - TanH
  - Sigmoid
  - Softmax
- Optimizers
  - Gradient Descent
  - Stochastic Gradient Descent
  - Adam
  - Momentum
  - Adagrad
  - RMSProp
  - Adadelta
- Learning Rate Scheduling
  - Step Decay
  - Exponential Decay
  - Learning Rate Warmup
  - Cosine Annealing
- Loss Functions
  - Mean Squared Error
  - Cross Entropy
  - Binary Cross Entropy
  - Hinge
  - Margin
  - Triplet
  - KL Divergence
  - Negative Log-Likelihood (NLL)
  - Mean Absolute Error (MAE)
  - Dice
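The first three features work together: tensor operations record edges in a computation graph, which reverse-mode automatic differentiation then walks backwards. A minimal scalar sketch of that idea in pure Python (the `Tensor` class and its methods here are illustrative inventions, not meerkat_dl's actual API):

```python
class Tensor:
    """Scalar-valued node in a dynamically built computation graph."""

    def __init__(self, data, parents=(), backward_fn=None):
        self.data = data                # scalar payload, for simplicity
        self.grad = 0.0                 # accumulated gradient
        self.parents = parents          # edges of the computation graph
        self.backward_fn = backward_fn  # propagates out.grad to parents

    def __mul__(self, other):
        out = Tensor(self.data * other.data, parents=(self, other))
        def backward_fn():
            self.grad += other.data * out.grad   # d(xy)/dx = y
            other.grad += self.data * out.grad   # d(xy)/dy = x
        out.backward_fn = backward_fn
        return out

    def __add__(self, other):
        out = Tensor(self.data + other.data, parents=(self, other))
        def backward_fn():
            self.grad += out.grad                # d(x+y)/dx = 1
            other.grad += out.grad               # d(x+y)/dy = 1
        out.backward_fn = backward_fn
        return out

    def backward(self):
        # Topologically sort the graph, then apply the chain rule in reverse.
        order, seen = [], set()
        def visit(node):
            if node not in seen:
                seen.add(node)
                for p in node.parents:
                    visit(p)
                order.append(node)
        visit(self)
        self.grad = 1.0
        for node in reversed(order):
            if node.backward_fn:
                node.backward_fn()

x = Tensor(3.0)
y = Tensor(4.0)
z = x * y + x      # z = xy + x
z.backward()
print(x.grad)      # dz/dx = y + 1 = 5.0
print(y.grad)      # dz/dy = x = 3.0
```

A full implementation extends the same pattern from scalars to n-dimensional arrays and to the operations listed above.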
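The learning-rate schedules listed above all follow simple closed-form rules. A hedged sketch in plain Python (function names and default hyperparameters are illustrative, not the library's API):

```python
import math

def step_decay(lr0, epoch, drop=0.5, every=10):
    # Multiply the rate by `drop` every `every` epochs.
    return lr0 * (drop ** (epoch // every))

def exponential_decay(lr0, epoch, k=0.05):
    # Continuous decay: lr = lr0 * exp(-k * epoch).
    return lr0 * math.exp(-k * epoch)

def warmup(lr0, epoch, warmup_epochs=5):
    # Linearly ramp from near zero up to lr0 over the first few epochs.
    return lr0 * min(1.0, (epoch + 1) / warmup_epochs)

def cosine_annealing(lr0, epoch, total, lr_min=0.0):
    # Smoothly anneal from lr0 down to lr_min over `total` epochs.
    return lr_min + 0.5 * (lr0 - lr_min) * (1 + math.cos(math.pi * epoch / total))
```

In practice warmup and cosine annealing are often composed: ramp up for the first few epochs, then anneal for the rest of training.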
## Roadmap

- Gradient checker
- Remove duplicate code in the operations' forward pass
- Fix the Min, Max, Mean, and Flatten operations
- Add regularization
  - L1/L2
  - DropBlock
  - Label Smoothing
- Add graph reset for the backward pass
- Eval mode
- Checkpointing
- Serialization
- Named parameters and named modules
- Model summary functionality
- Dataset loader
- Write tests
- Write examples
- GPU support
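The gradient checker on the roadmap is typically built from central finite differences: perturb each input coordinate by a small epsilon and compare the numerical slope against the gradient that autodiff produced. A minimal sketch of that technique (the function names here are illustrative):

```python
def numerical_gradient(f, x, eps=1e-6):
    # Central differences: df/dx_i ~= (f(x + eps*e_i) - f(x - eps*e_i)) / (2*eps)
    grad = []
    for i in range(len(x)):
        xp = list(x); xp[i] += eps
        xm = list(x); xm[i] -= eps
        grad.append((f(xp) - f(xm)) / (2 * eps))
    return grad

def check_gradient(f, analytic, x, tol=1e-5):
    # Compare coordinate-wise, with a tolerance scaled to the gradient size.
    num = numerical_gradient(f, x)
    return all(abs(n - a) <= tol * max(1.0, abs(n), abs(a))
               for n, a in zip(num, analytic))

# Example: f(x) = sum(x_i^2) has gradient 2x.
f = lambda v: sum(vi ** 2 for vi in v)
x = [1.0, 2.0, 3.0]
print(check_gradient(f, [2 * xi for xi in x], x))  # True
```

Running this check over every operation's backward pass is a cheap way to catch the kind of bugs the Min, Max, Mean, and Flatten items above refer to.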
## Contributing

Interested in contributing? Check out the contributing guidelines. Please note that this project is released with a Code of Conduct. By contributing to this project, you agree to abide by its terms.
## License

`meerkat_dl` was created by Raghav Saboo. It is licensed under the terms of the MIT license.
## References

- Paszke, Adam, et al. "Automatic differentiation in PyTorch." (2017).
- Baydin, Atilim Gunes, et al. "Automatic differentiation in machine learning: a survey." (2018).
## Credits

`meerkat_dl` was created with `cookiecutter` and the `py-pkgs-cookiecutter` template.