Massively Parallel Grid Parameter Search for scRNA-seq Analysis
Updated Aug 20, 2021 - Jupyter Notebook
Hyperparameter Optimization Using Scikit-Optimize
Example of Neu.ro integration with NNI for hyperparameter tuning
A lightweight custom AutoML library.
Hyperparameter selection on machine learning models using Particle Swarm Optimization
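To illustrate the idea behind PSO-based hyperparameter selection (this is a generic sketch, not code from that repository), a minimal particle swarm optimizer over a box-constrained toy objective can be written in pure Python; the function name `pso_minimize` and the quadratic test surface are hypothetical stand-ins for a real model's validation loss:

```python
import random

def pso_minimize(objective, bounds, n_particles=20, n_iters=60,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimization over box-constrained parameters."""
    rng = random.Random(seed)
    dim = len(bounds)
    # Initialize particle positions uniformly within the bounds, velocities at zero.
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # each particle's best position so far
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # swarm-wide best
    for _ in range(n_iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Velocity update: inertia + cognitive pull + social pull.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # Move and clamp to the search box.
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]), bounds[d][1])
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy "hyperparameter" objective: a quadratic with its optimum at (2, -1).
best, best_val = pso_minimize(lambda p: (p[0] - 2) ** 2 + (p[1] + 1) ** 2,
                              bounds=[(-5, 5), (-5, 5)])
```

In a real setting the lambda would be replaced by a function that trains a model with the candidate hyperparameters and returns its validation loss.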
Code written in Python 2.7 for an experiment on an SVM with an RBF kernel, from "Robust and Efficient Kernel Hyperparameter Paths with Guarantees".
MLCloudSearch provides tooling that facilitates working with Scikit-Learn and TensorFlow Keras models. The name was inspired by the machine learning technique of searching for the best parameters for fitting a model, commonly referred to as hyperparameter search. Training can be done locally or in the cloud.
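The hyperparameter search these tools refer to is, in its simplest form, an exhaustive grid search. A self-contained sketch (the `grid_search` helper and the toy score surface below are illustrative, not part of any listed library):

```python
from itertools import product

def grid_search(score_fn, param_grid):
    """Exhaustively evaluate every combination in param_grid, keep the best."""
    keys = list(param_grid)
    best_params, best_score = None, float("-inf")
    for values in product(*(param_grid[k] for k in keys)):
        params = dict(zip(keys, values))
        score = score_fn(**params)
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score

# Toy score surface peaking at C=1.0, gamma=0.1 (stand-ins for SVM hyperparameters).
score = lambda C, gamma: -((C - 1.0) ** 2 + (gamma - 0.1) ** 2)
best_params, best_score = grid_search(score, {"C": [0.1, 1.0, 10.0],
                                              "gamma": [0.01, 0.1, 1.0]})
# best_params == {"C": 1.0, "gamma": 0.1}
```

In practice `score_fn` would train and cross-validate a model for each combination, which is why the grid is the natural unit to parallelize across machines or cloud workers.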
Hyperparameter search algorithm
Machine learning term project for IOG5016 and IOG5018
Tree-structured Parzen estimator (TPE) hyperparameter optimization
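A heavily simplified 1-D sketch of the TPE idea (real implementations such as Hyperopt use adaptive bandwidths, priors, and tree-structured search spaces; the fixed bandwidth and the `tpe_suggest` helper here are assumptions for illustration): split observed trials into a "good" low-loss fraction and a "bad" remainder, model each with a kernel density, and propose the candidate maximizing the density ratio l(x)/g(x):

```python
import math
import random

def tpe_suggest(trials, low, high, gamma=0.25, n_candidates=24, rng=random):
    """One simplified TPE step for a single 1-D parameter.

    trials: list of (x, loss) pairs. Splits trials into "good" (lowest-loss
    gamma fraction) and "bad", estimates a density for each set, and returns
    the candidate x maximizing the ratio l(x)/g(x).
    """
    if len(trials) < 4:
        return rng.uniform(low, high)  # not enough data yet: explore at random
    ordered = sorted(trials, key=lambda t: t[1])
    n_good = max(1, int(gamma * len(ordered)))
    good = [x for x, _ in ordered[:n_good]]
    bad = [x for x, _ in ordered[n_good:]]
    bw = (high - low) / 10.0  # fixed kernel bandwidth, for simplicity

    def density(x, points):
        # Average of Gaussian kernels centered on the observed points.
        return sum(math.exp(-0.5 * ((x - p) / bw) ** 2) for p in points) / len(points) + 1e-12

    # Sample candidates near good points; keep the best l(x)/g(x) ratio.
    cands = [min(max(rng.gauss(rng.choice(good), bw), low), high)
             for _ in range(n_candidates)]
    return max(cands, key=lambda x: density(x, good) / density(x, bad))
```

A usage loop would repeatedly call `tpe_suggest`, evaluate the objective at the suggested point, and append the result to `trials`, so that later suggestions concentrate around the low-loss region.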
Project presented in the parallel and distributed computing course of the Graduate Program in Computer Science at CEFET-RJ (2018.3)
A specialized repository for fitting the Vikhlinin model to galaxy cluster density profiles.
The notebook shows how machine learning tools and algorithms (scikit-learn, XGBoost, LightGBM) work in practice.
In this project, I use the Optuna library to perform a hyperparameter search for Professor Wojciech Broniowski's implementation of the Ant Colony Optimization (ACO) algorithm, tuning its hyperparameters to improve the algorithm's performance.
A simple python interface for running multiple parallel instances of a python program (e.g. gridsearch).
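A generic version of that pattern (not that repository's interface; the `evaluate` function and its toy score surface are hypothetical) fans each grid point out to its own worker process with the standard library's `concurrent.futures`:

```python
from concurrent.futures import ProcessPoolExecutor
from itertools import product

def evaluate(params):
    """Stand-in for one training run; returns (params, score)."""
    lr, batch = params
    # Toy score surface peaking at lr=0.1, batch=32.
    return params, -((lr - 0.1) ** 2 + ((batch - 32) / 32) ** 2)

def parallel_grid_search(grid, max_workers=4):
    """Evaluate every grid combination, one worker process per task."""
    combos = list(product(*grid))
    with ProcessPoolExecutor(max_workers=max_workers) as pool:
        results = list(pool.map(evaluate, combos))
    return max(results, key=lambda r: r[1])

if __name__ == "__main__":
    # The __main__ guard matters: worker processes re-import this module
    # on platforms that use the "spawn" start method.
    best_params, best_score = parallel_grid_search([(0.01, 0.1, 1.0), (16, 32, 64)])
```

Because each grid point is independent, this embarrassingly parallel structure is what makes grid search such a common target for multi-process and multi-machine tooling.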
In this project, I use the Optuna library to perform a hyperparameter search for Professor Wojciech Broniowski's implementation of a Genetic Algorithm, and compare it against Ant Colony Optimization. Optuna tunes the hyperparameters to improve the algorithm's performance.
Easy & quick: Docker services running Hyperopt in parallel on top of MongoDB
A wrapper for searching over neural network architectures, built on top of the MXNet framework, in particular Gluon.
Additional Callbacks for Weights & Biases to monitor your models even better 🔎
I trained two different CNN models for binary image classification to compare which architecture achieves better accuracy, which trains faster, how hyperparameters affect training, and how many epochs each needs. The best model achieved 96% accuracy.