CycleSL: Server-Client Cyclical Update Driven Scalable Split Learning
Updated May 23, 2024 - Python
Workshop materials for the course "Computer Decision Support Systems" at V. N. Karazin Kharkiv National University
Code for the paper "Let’s Make Block Coordinate Descent Go Fast"
An implementation of Lasso via Coordinate Descent and LARS (Least Angle Regression).
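Several repositories above implement the lasso by coordinate descent. As a reference for how that update works, here is a minimal NumPy sketch (the function names and the 1/(2n) loss scaling are my own choices, not taken from any listed repository): each sweep cycles through the coordinates and applies the closed-form soft-thresholding update to one weight at a time.

```python
import numpy as np

def soft_threshold(z, t):
    # Soft-thresholding operator: the closed-form minimizer of the
    # one-dimensional lasso subproblem.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, alpha, n_iter=100):
    """Minimize (1/(2n)) * ||y - Xw||^2 + alpha * ||w||_1
    by cyclic coordinate descent."""
    n, d = X.shape
    w = np.zeros(d)
    col_sq = (X ** 2).sum(axis=0)  # per-coordinate curvature ||X_j||^2
    for _ in range(n_iter):
        for j in range(d):
            # Partial residual with coordinate j's contribution removed
            r = y - X @ w + X[:, j] * w[j]
            rho = X[:, j] @ r
            w[j] = soft_threshold(rho, n * alpha) / col_sq[j]
    return w
```

With a large enough `alpha`, every coordinate is thresholded to exactly zero, which is the sparsity property these repositories exploit.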
Fast and modular solver for sparse generalized linear models
A sparsity aware implementation of "Deep Autoencoder-like Nonnegative Matrix Factorization for Community Detection" (CIKM 2018).
Implementations of various numerical optimization methods, written in plain Java.
Implementation of optimization algorithms in python
Solving Quadratic Programming problem with LASSO constraint
Hello! All of this code is my own, written for my Machine Learning Lab class. Enjoy!
Library of Semi-Relaxed Optimal Transport
Implementation of Relaxed Lasso Algorithm for Linear Regression.
Associated codebase for Byzantine-resilient distributed / decentralized machine learning papers from INSPIRE Lab
Block coordinate descent for group lasso
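For the group lasso, the natural unit of update is a whole block of coefficients rather than a single coordinate. The sketch below is a hedged illustration, not the listed repository's code: it assumes each group's columns have been orthonormalized (X_g^T X_g = I), which makes the per-block update a closed-form "block soft-thresholding" that shrinks the whole group toward zero together.

```python
import numpy as np

def group_lasso_bcd(X, y, groups, lam, n_iter=100):
    """Block coordinate descent for
        (1/2) * ||y - Xw||^2 + lam * sum_g ||w_g||_2,
    assuming each group's columns are orthonormal (X_g^T X_g = I),
    so the block subproblem has a closed-form solution."""
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        for g in groups:  # g is an index array selecting one block
            # Residual with block g's contribution removed
            r = y - X @ w + X[:, g] @ w[g]
            z = X[:, g].T @ r
            norm_z = np.linalg.norm(z)
            # Block soft-thresholding: either kill the whole group
            # or shrink its norm by lam
            w[g] = max(0.0, 1.0 - lam / norm_z) * z if norm_z > 0 else 0.0
    return w
```

The group-level thresholding is what produces groupwise sparsity: a block is either entirely zero or entirely active.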
A Python script collecting implementations of several popular gradient descent methods.
Explanations and Python implementations of Ordinary Least Squares regression, Ridge regression, Lasso regression (solved via Coordinate Descent), and Elastic Net regression (also solved via Coordinate Descent) applied to assess wine quality given numerous numerical features. Additional data analysis and visualization in Python is included.
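The elastic net mentioned above combines the L1 and L2 penalties, and its coordinate-descent update is a small extension of the lasso one. This is a sketch under my own parameterization (scikit-learn-style `alpha`/`l1_ratio` names, 1/(2n) loss scaling), not code from the repository: the L1 part still appears as a soft-threshold, while the L2 part simply enlarges the denominator.

```python
import numpy as np

def enet_cd(X, y, alpha, l1_ratio=0.5, n_iter=100):
    """Cyclic coordinate descent for the elastic net objective
        (1/(2n)) * ||y - Xw||^2
        + alpha * (l1_ratio * ||w||_1 + 0.5 * (1 - l1_ratio) * ||w||_2^2)."""
    n, d = X.shape
    w = np.zeros(d)
    col_sq = (X ** 2).sum(axis=0)
    l1 = n * alpha * l1_ratio          # scaled L1 penalty
    l2 = n * alpha * (1.0 - l1_ratio)  # scaled L2 penalty
    for _ in range(n_iter):
        for j in range(d):
            r = y - X @ w + X[:, j] * w[j]   # partial residual
            rho = X[:, j] @ r
            # Soft-threshold handles L1; the L2 term shrinks via the
            # enlarged denominator
            w[j] = np.sign(rho) * max(abs(rho) - l1, 0.0) / (col_sq[j] + l2)
    return w
```

Setting `l1_ratio=1` recovers the lasso update, and `l1_ratio=0` reduces each step to an exact coordinate minimization of the ridge objective.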