Several machine learning classifiers in Python
Faster alternative to Fast Feedforward Layer that uses angular distance for routing
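The routing idea described above can be sketched minimally: each input is sent to the expert whose key vector has the smallest angular distance (equivalently, the largest cosine similarity). This is a hedged illustration only; the function and parameter names are hypothetical and the repository's actual implementation may differ.

```python
import numpy as np

def angular_routing(x, expert_keys):
    """Route each input row to the expert whose key vector has the
    smallest angular distance (largest cosine similarity)."""
    x_n = x / np.linalg.norm(x, axis=-1, keepdims=True)
    k_n = expert_keys / np.linalg.norm(expert_keys, axis=-1, keepdims=True)
    cos_sim = x_n @ k_n.T              # (batch, n_experts)
    return np.argmax(cos_sim, axis=-1)

# Two orthogonal expert keys; each input is roughly aligned with one of them.
keys = np.array([[1.0, 0.0], [0.0, 1.0]])
x = np.array([[0.9, 0.1], [0.2, 0.8]])
print(angular_routing(x, keys))  # -> [0 1]
```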
This is the repository for the MixKABRN Neural Network (Mixture of Kolmogorov-Arnold Bit Retentive Networks): an attempt at first adapting it for training on text, and later adjusting it for other modalities.
Using CCR to predict piezoresponse force microscopy datasets
Code repository for: Nguyen, H., Nguyen, T., Nguyen, K., & Ho, N. (2024). Towards Convergence Rates for Parameter Estimation in Gaussian-gated Mixture of Experts. In Proceedings of the 27th International Conference on Artificial Intelligence and Statistics (AISTATS 2024). Acceptance rate: 27.6% of 1,980 submissions.
Anomaly Detection by Recombining Gated Unsupervised Experts
Differentially private retriever using transformer memory as a search index for information retrieval
Welcome to the Uncertainty-aware Mixture of Experts (uMoE) GitHub repository. This repository contains the implementation and documentation for our uMoE model, designed to train neural networks effectively using a mixture-of-experts architecture.
A review of Google's multitask ranking system, compared with other methods used in recommender systems
an LLM toolkit
Bayesian Learning for Control in Multimodal Dynamical Systems | written in Org-mode
Code, data, and pre-trained models for our EMNLP 2021 paper "Think about it! Improving defeasible reasoning by first modeling the question scenario"
This R package allows emulation of partial differential equation (PDE) systems using a mesh-clustered Gaussian process (mcGP) model.
These instructions aim to reproduce the results in the paper "Mesh-clustered Gaussian process emulator for partial differential equation boundary value problems" (2024), to appear in Technometrics.
Gaussian Process-Gated Hierarchical Mixture of Experts
MoE Decoder Transformer implementation with MLX
This is a prototype of a Mixture-of-Experts LLM built with PyTorch. Currently in development; I am testing its learning capabilities on small toy tasks before training it on large language datasets.
This collaborative framework is designed to harness the power of a Mixture of Experts (MoE) to automate a wide range of software engineering tasks, thereby enhancing code quality and expediting development processes.
Anomaly detection using ARGUE - an advanced mixture-of-experts autoencoder model
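Common to many of the repositories above is the sparse MoE gating mechanism: a learned gate scores all experts per input, the top-k experts are selected, and their outputs are combined with the renormalized gate weights. The sketch below is a minimal NumPy illustration of that general pattern, assuming linear experts and top-k softmax routing; it is not the implementation of any particular repository listed here.

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def moe_forward(x, gate_w, expert_ws, top_k=2):
    """Sparse MoE layer: a softmax gate picks top_k experts per input;
    the output is the gate-weighted sum of the selected experts' outputs."""
    gates = softmax(x @ gate_w)                   # (batch, n_experts)
    top = np.argsort(gates, axis=-1)[:, -top_k:]  # indices of top_k experts
    out = np.zeros_like(x)
    for i, idx in enumerate(top):
        w = gates[i, idx] / gates[i, idx].sum()   # renormalize selected gates
        for j, g in zip(idx, w):
            out[i] += g * (x[i] @ expert_ws[j])   # linear expert j
    return out

rng = np.random.default_rng(0)
x = rng.normal(size=(2, 4))                 # batch of 2 inputs, dim 4
gate_w = rng.normal(size=(4, 3))            # gate over 3 experts
expert_ws = rng.normal(size=(3, 4, 4))      # one linear expert per slot
y = moe_forward(x, gate_w, expert_ws)
print(y.shape)  # -> (2, 4)
```

Routing only top_k of the experts is what keeps MoE compute sparse: the cost per token scales with top_k, not with the total number of experts.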