DyNA is a framework for dynamic, data-driven nonlinear signal propagation, inspired by biological neural networks.
Official Repository for "Mish: A Self Regularized Non-Monotonic Neural Activation Function" [BMVC 2020]
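Mish has the closed form x · tanh(softplus(x)). A minimal sketch of that formula (the use of NumPy here is my choice for illustration, not necessarily how the official repository implements it):

```python
import numpy as np

def softplus(x):
    # Numerically stable softplus: log(1 + exp(x))
    return np.logaddexp(0.0, x)

def mish(x):
    # Mish (Misra, BMVC 2020): x * tanh(softplus(x))
    return x * np.tanh(softplus(x))
```

For large positive inputs Mish approaches the identity, while for large negative inputs it decays toward zero, which gives it its characteristic non-monotonic shape.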
This code implements the activation functions applied at each neuron of a neural network.
Machine Learning Library for C++
ActTensor: Activation Functions for TensorFlow. https://pypi.org/project/ActTensor-tf/ Authors: Pouya Ardehkhani, Pegah Ardehkhani
hyper-sinh: An Accurate and Reliable Activation Function from Shallow to Deep Learning in TensorFlow, Keras, and PyTorch
QReLU and m-QReLU: Two novel quantum activation functions for Deep Learning in TensorFlow, Keras, and PyTorch
Custom implementations of L1/L2/BCE/CE loss and ReLU/Sigmoid/Tanh/Softmax activation functions.
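Independent of that repository's code, minimal NumPy versions of these activation and loss functions might look like the following (a sketch, not the repository's implementation):

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    # Subtract the max for numerical stability before exponentiating.
    z = x - np.max(x, axis=-1, keepdims=True)
    e = np.exp(z)
    return e / np.sum(e, axis=-1, keepdims=True)

def bce(y_true, y_pred, eps=1e-12):
    # Binary cross-entropy; clip predictions to avoid log(0).
    p = np.clip(y_pred, eps, 1 - eps)
    return -np.mean(y_true * np.log(p) + (1 - y_true) * np.log(1 - p))
```

Tanh is available directly as `np.tanh`, and L1/L2 losses reduce to `np.mean(np.abs(d))` and `np.mean(d ** 2)` over the error `d`.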
"The 'Activation Functions' project repository contains implementations of various activation functions commonly used in neural networks. "
My recent work on the importance and application of Hilbert-space function theory in artificial-intelligence algorithms. The main motivation was to improve the convergence and learning rates of various learning algorithms via the Generalized Gaussian Radial Basis Function.
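The standard Gaussian RBF is exp(-(εr)²); a common generalization replaces the fixed exponent 2 with a shape parameter β. The exact parameterization used in that work is not specified here, so the form below is an assumption for illustration:

```python
import numpy as np

def gg_rbf(r, epsilon=1.0, beta=2.0):
    # Assumed generalized Gaussian RBF: exp(-(epsilon * |r|) ** beta).
    # beta = 2 recovers the standard Gaussian kernel.
    return np.exp(-np.power(epsilon * np.abs(r), beta))
```

Varying β changes how sharply the kernel decays with distance, which is one lever for tuning convergence behavior.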
Introduction to Deep Learning
This project involves building an Artificial Neural Network (ANN) for predicting customer churn. The dataset used contains various customer attributes, and the ANN is trained to predict whether a customer is likely to leave the bank.
This project provides an interactive dashboard for analyzing the different activation function in neural networks. LeviLayer is a novel activation function that has shown promising results in various deep learning tasks. With this dashboard, users can explore the behavior of LeviLayer and compare it with other popular activation functions.
Deep learning system notes, covering the mathematical foundations of deep learning, detailed explanations of core neural network components, model-training strategies, model compression algorithms, and a hands-on implementation of a deep learning inference framework.
This repository contains the final project for a bachelor's degree.
A small web app for visualizing activation functions.
🐱 Feedforward neural network from scratch in pure Python - backpropagation, gradient descent, activation functions
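The core loop of such a from-scratch network can be sketched in pure Python with a single sigmoid neuron trained by gradient descent on a toy task (logical AND); this is a minimal illustration of the technique, not the repository's code:

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Toy dataset: learn logical AND with one sigmoid neuron.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

random.seed(0)
w = [random.uniform(-1, 1) for _ in range(2)]
b = 0.0
lr = 0.5

for epoch in range(2000):
    for (x1, x2), y in data:
        z = w[0] * x1 + w[1] * x2 + b
        p = sigmoid(z)
        # For binary cross-entropy, the gradient w.r.t. z is (p - y).
        g = p - y
        w[0] -= lr * g * x1
        w[1] -= lr * g * x2
        b -= lr * g
```

A full feedforward network repeats the same pattern layer by layer, propagating the `(p - y)`-style error term backward through each layer's weights.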