Projects done in the AI learning algorithm class @ UCSD
Updated Apr 2, 2020 - Python
Code for the paper "Block-coordinate primal-dual algorithm for linearly constrained optimization problem"
Sample Convex Optimization using Gradient Descent, Newton's method and Coordinate Descent
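As a quick illustration of the three methods this description names (my own hedged sketch, not the repo's code), here is a toy convex quadratic minimized by gradient descent, Newton's method, and cyclic coordinate descent with exact one-dimensional minimization:

```python
import numpy as np

# Toy problem (assumed for illustration): minimize f(x) = 0.5 x'Ax - b'x.
A = np.array([[3.0, 1.0], [1.0, 2.0]])   # symmetric positive definite
b = np.array([1.0, 1.0])
x_star = np.linalg.solve(A, b)           # exact minimizer, for reference

def grad(x):
    return A @ x - b

# Gradient descent with step size 1/L, L = largest eigenvalue of A.
x = np.zeros(2)
step = 1.0 / np.linalg.eigvalsh(A)[-1]
for _ in range(500):
    x -= step * grad(x)
x_gd = x

# Newton's method: for a quadratic, one step lands on the minimizer
# (the Hessian is just A).
x_newton = np.zeros(2) - np.linalg.solve(A, grad(np.zeros(2)))

# Cyclic coordinate descent with exact minimization: setting the i-th
# partial derivative to zero while the other coordinates are fixed gives
# x[i] = (b[i] - A[i] @ x + A[i, i] * x[i]) / A[i, i].
x = np.zeros(2)
for _ in range(200):
    for i in range(2):
        x[i] = (b[i] - A[i] @ x + A[i, i] * x[i]) / A[i, i]
x_cd = x
```

All three arrive at the same minimizer here; they differ in per-iteration cost and in how many iterations they need.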
Implementation of optimization algorithms in Python
CycleSL: Server-Client Cyclical Update Driven Scalable Split Learning
Predicting the points (quality score) of a wine from its reviews
Implementation of Relaxed Lasso Algorithm for Linear Regression.
Implementations of various numerical optimization methods, written in plain Java.
Workshop on the course "Computer decision support systems" at V. N. Karazin Kharkiv National University
Implementation of a coordinate descent method accelerated by a universal meta-algorithm with efficient amortized per-iteration complexity, plus experiments on a sparse SoftMax function where the proposed method outperforms FGM
Hello! All code here is my own, written for my Machine Learning Lab class. Enjoy!
A from-scratch lasso implementation to estimate parameters of an n×p design matrix when p >> n
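A from-scratch lasso solver of this kind is usually coordinate descent with soft-thresholding; the sketch below is my own hedged illustration (not the repo's code), minimizing (1/2n)·||y − Xβ||² + λ·||β||₁ on a p >> n toy problem:

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of the absolute value: shrink toward zero by t.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_sweeps=200):
    # Cyclic coordinate descent for the lasso; each coordinate update
    # has the closed form beta_j = S(rho_j, lam) / (||X_j||^2 / n).
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n        # per-coordinate curvature
    resid = y - X @ beta
    for _ in range(n_sweeps):
        for j in range(p):
            # rho_j uses the partial residual that excludes coordinate j.
            rho = X[:, j] @ resid / n + col_sq[j] * beta[j]
            new = soft_threshold(rho, lam) / col_sq[j]
            resid += X[:, j] * (beta[j] - new)   # keep residual in sync
            beta[j] = new
    return beta

# p >> n toy data: n = 20 samples, p = 100 features, 2-sparse true signal.
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 100))
beta_true = np.zeros(100)
beta_true[:2] = [3.0, -2.0]
y = X @ beta_true
beta_hat = lasso_cd(X, y, lam=0.1)
```

The soft-thresholding step is what produces exact zeros, which is why the lasso remains well-posed even when p far exceeds n.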
Implementation of algorithms for stereo vision
Powell's example of cyclic non-convergence in coordinate descent with exact minimization
Solving a quadratic programming problem with a LASSO constraint
This repository contains code and functions for Ridge Regression (normal-equation method and coordinate descent method) and Lasso Regression (coordinate descent method), along with some analysis of the performance of these functions/models and a comparison against scikit-learn.
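For the ridge part, the two solution routes the description mentions can be sketched as follows (my own hedged illustration, not the repo's code), minimizing ||y − Xβ||² + λ·||β||²:

```python
import numpy as np

# Toy regression data (assumed for illustration).
rng = np.random.default_rng(1)
X = rng.standard_normal((50, 5))
y = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + 0.1 * rng.standard_normal(50)
lam = 2.0

# Route 1 — normal equation: solve (X'X + lam*I) b = X'y directly.
b_ne = np.linalg.solve(X.T @ X + lam * np.eye(5), X.T @ y)

# Route 2 — cyclic coordinate descent: each 1-D subproblem is a smooth
# quadratic with the closed-form minimizer below (no soft-thresholding,
# unlike the lasso, since the ridge penalty is differentiable).
b = np.zeros(5)
for _ in range(300):
    for j in range(5):
        r_j = y - X @ b + X[:, j] * b[j]        # partial residual
        b[j] = (X[:, j] @ r_j) / (X[:, j] @ X[:, j] + lam)
```

Both routes converge to the same coefficients; the normal equation is a single O(p³) solve, while coordinate descent avoids forming X'X and scales better when p is large.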
Library of Semi-Relaxed Optimal Transport