Official Repository for "Mish: A Self Regularized Non-Monotonic Neural Activation Function" [BMVC 2020]
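Mish, the activation this repository introduces, is defined as x·tanh(softplus(x)). A minimal sketch in plain Python (the numerically stable softplus rewrite is a common implementation choice, not taken from the repository itself):

```python
import math

def softplus(x: float) -> float:
    # Numerically stable softplus: ln(1 + e^x), rewritten to avoid overflow
    return math.log1p(math.exp(-abs(x))) + max(x, 0.0)

def mish(x: float) -> float:
    # Mish(x) = x * tanh(softplus(x))
    return x * math.tanh(softplus(x))

print(mish(0.0))  # → 0.0, since the factor x is zero
```

Like ReLU, Mish is unbounded above, but it is smooth everywhere and dips slightly below zero for small negative inputs, which is where its non-monotonicity comes from.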
Updated May 10, 2024 - Jupyter Notebook
PyTorch implementation of SIREN - Implicit Neural Representations with Periodic Activation Functions
Rethinking Image Inpainting via a Mutual Encoder-Decoder with Feature Equalizations (ECCV 2020 Oral)
PyTorch implementation of Sinusoidal Representation Networks (SIREN)
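SIREN layers apply a sine to an affine transform, y = sin(w0·(Wx + b)). A minimal plain-Python sketch (the list-of-lists weight layout is an illustration choice; w0 = 30 is the frequency scale the SIREN paper uses for its first layer):

```python
import math

def siren_layer(x, weights, biases, w0=30.0):
    # y_j = sin(w0 * ((W x)_j + b_j)); weights is a list of rows, one per unit
    return [math.sin(w0 * (sum(w * xi for w, xi in zip(row, x)) + b))
            for row, b in zip(weights, biases)]

# One input, one unit, zero weight and bias: sin(w0 * 0) = 0
print(siren_layer([0.5], [[0.0]], [0.0]))  # → [0.0]
```

The periodic activation is what lets these networks fit fine detail in implicit representations (images, SDFs, audio) that ReLU MLPs smooth over.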
Deep learning systems notes, covering mathematical foundations of deep learning, detailed explanations of core neural network components, model training strategies, model compression algorithms, and a hands-on guide to implementing a deep learning inference framework.
All the code files related to the deep learning course from PadhAI
Korean (Hangul) OCR model design
AReLU: Attention-based Rectified Linear Unit
Unofficial implementation of 'Implicit Neural Representations with Periodic Activation Functions'
Intro to Deep Learning by National Research University Higher School of Economics
Implementing activation functions from scratch in TensorFlow.
Image-to-image translation using conditional GANs (Pix2Pix), implemented in TensorFlow 2.0
💩 Sigmoid Colon: The biologically inspired activation function.
Reservoir computing library for .NET. Enables ESNs, LSMs, and hybrid RNNs using analog and spiking neurons working together.
PyTorch reimplementation of the Smooth ReLU activation function proposed in the paper "Real World Large Scale Recommendation Systems Reproducibility and Smooth Activations" [arXiv 2022].
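The Smooth ReLU (SmeLU) from that paper replaces ReLU's kink at zero with a quadratic blend of half-width beta. A plain-Python sketch under that reading of the paper (beta = 1.0 is an arbitrary default for illustration):

```python
def smelu(x: float, beta: float = 1.0) -> float:
    # SmeLU: 0 below -beta, identity above beta, quadratic blend in between
    if x <= -beta:
        return 0.0
    if x >= beta:
        return x
    return (x + beta) ** 2 / (4.0 * beta)

print(smelu(0.0))  # → 0.25 (the quadratic piece: (0+1)^2 / 4)
```

The quadratic piece meets both linear pieces with matching value and slope, so the function is continuously differentiable, which is the reproducibility argument the paper makes for recommendation models.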
An easy-to-use library for GLU (Gated Linear Units) and GLU variants in TensorFlow.
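A GLU splits its input features in half and uses one half to gate the other: GLU(x) = a ⊙ sigmoid(b). A minimal sketch in plain Python, independent of that library's actual API (the halving along the feature axis follows the original GLU formulation):

```python
import math

def glu(x):
    # Split features in half: first half carries values, second half gates them
    n = len(x) // 2
    a, b = x[:n], x[n:]
    # a_i * sigmoid(b_i) = a_i / (1 + e^(-b_i))
    return [ai / (1.0 + math.exp(-bi)) for ai, bi in zip(a, b)]

print(glu([2.0, 0.0]))  # → [1.0], since sigmoid(0) = 0.5
```

GLU variants (GEGLU, SwiGLU, ReGLU) swap the sigmoid for other gate nonlinearities while keeping the same split-and-gate structure.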
ActTensor: Activation Functions for TensorFlow. https://pypi.org/project/ActTensor-tf/ Authors: Pouya Ardehkhani, Pegah Ardehkhani
[TCAD 2018] Code for “Design Space Exploration of Neural Network Activation Function Circuits”
Interactive visualizations and demos that are used in a blog post I wrote about logic in the context of neural networks
Unofficial PyTorch implementation of the Piecewise Linear Unit (PWLU) dynamic activation function