Nonconvex embedded optimization: code generation for fast real-time optimization
Open-L2O: A Comprehensive and Reproducible Benchmark for Learning to Optimize Algorithms
A next-gen solver for nonlinearly constrained nonconvex optimization. Modular and lightweight, it unifies iterative methods (SQP and interior-point) and globalization techniques (filter or merit function, line search or trust region) in a single framework. Competitive with IPOPT, filterSQP, SNOPT, MINOS, and CONOPT.
Simulation code for "Fast Converging Algorithm for Weighted Sum Rate Maximization in Multicell MISO Downlink" by Le-Nam Tran, Muhammad Fainan Hanif, Antti Tolli, Markku Juntti, IEEE Signal Processing Letters 19.12 (2012): 872-875
Nonconvex Exterior Point Operator Splitting
Semi-random funky stuff, mainly for my PhD experiments and articles. Contains calculations and algorithm implementations for various applied mathematics and astrophysics articles I worked on.
Proximal Augmented Lagrangian method for Quadratic Programs
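As a rough illustration of the method named above, here is a minimal proximal augmented Lagrangian iteration for an equality-constrained QP. The function name `proximal_alm_qp`, the fixed penalty `rho`, the proximal weight `sigma`, and the dense linear solve are all illustrative assumptions, not the repository's actual algorithm:

```python
import numpy as np

def proximal_alm_qp(Q, q, A, b, rho=10.0, sigma=1e-3, iters=100):
    """Sketch: solve  min 0.5 x^T Q x + q^T x  s.t.  A x = b
    with a proximal augmented Lagrangian iteration."""
    n = Q.shape[0]
    x = np.zeros(n)
    y = np.zeros(A.shape[0])  # dual variable for A x = b
    # KKT-like system matrix for the regularized primal subproblem
    K = Q + rho * A.T @ A + sigma * np.eye(n)
    for _ in range(iters):
        # x-step: minimize L_rho(x, y) + (sigma/2) ||x - x_prev||^2
        rhs = -q - A.T @ y + rho * A.T @ b + sigma * x
        x = np.linalg.solve(K, rhs)
        # y-step: dual ascent on the equality residual
        y = y + rho * (A @ x - b)
    return x, y
```

For example, minimizing 0.5‖x‖² subject to x₁ + x₂ = 1 recovers x = (0.5, 0.5). A production solver would of course handle inequality constraints and update `rho` adaptively.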
Coordinate and Incremental Aggregated Optimization Algorithms
Build a recurrent neural network for solving optimization problems
Optimized and benchmarked parallel Genetic Algorithm with inequality constraints, and a scipy-like interface
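The repository's own code isn't shown here, but a penalty-based genetic algorithm with a scipy-like call signature can be sketched as follows. The name `minimize_ga`, the `g(x) >= 0` inequality convention, and every hyperparameter are illustrative assumptions, and this sketch is serial rather than parallel:

```python
import numpy as np

def minimize_ga(f, bounds, constraints=(), pop_size=40, gens=200, seed=0):
    """Sketch of a genetic algorithm with inequality constraints.
    `constraints` are callables g(x) that must satisfy g(x) >= 0,
    enforced here via a large additive penalty on violations."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    pop = rng.uniform(lo, hi, size=(pop_size, lo.size))

    def fitness(x):
        penalty = sum(max(0.0, -g(x)) for g in constraints)
        return f(x) + 1e3 * penalty

    for _ in range(gens):
        scores = np.array([fitness(x) for x in pop])
        order = np.argsort(scores)
        elite = pop[order[: pop_size // 2]]              # truncation selection
        parents = elite[rng.integers(0, len(elite), size=(pop_size, 2))]
        alpha = rng.uniform(size=(pop_size, 1))
        children = alpha * parents[:, 0] + (1 - alpha) * parents[:, 1]  # blend crossover
        children += rng.normal(scale=0.05 * (hi - lo), size=children.shape)  # mutation
        children = np.clip(children, lo, hi)
        pop = np.vstack([elite[:1], children[: pop_size - 1]])  # elitism: keep best
    scores = np.array([fitness(x) for x in pop])
    return pop[np.argmin(scores)]
```

For instance, minimizing x² + y² subject to x + y ≥ 1 over [-2, 2]² drives the population toward the constrained optimum near (0.5, 0.5).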
Solving Convex and Nonconvex Optimization Problems
HarvesterOpt is a code for maximizing the energy recovered by a bistable energy harvester.
Quantum control algorithms for multiple examples based on the package Qutip
Fix and Bound: An efficient approach for solving large-scale BoxQPs
Simulation code for "A Conic Quadratic Programming Approach to Physical Layer Multicasting for Large-Scale Antenna Arrays." by L.-N. Tran, M. F. Hanif, M. Juntti., IEEE Signal Processing Letters 21.1 (2014): 114-117
Code and results for quantum optimal control based on switching time optimization
R package for fast optimization of type I/II error, recall/precision, and F1 score.
Proof-of-concept implementation of a method by Song and Khan (2022) for computing convex relaxations for parametric ordinary differential equations.
Non-linear topology identification using deep learning. Sparsity (lasso) is enforced on the sensor connections, and the resulting nonconvex, non-differentiable objective is minimized with a subgradient descent algorithm.
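The optimization step in the last item can be illustrated with a minimal linear analogue: subgradient descent on a lasso objective in NumPy. The averaged least-squares loss, the 1/√t step-size schedule, and the constants are illustrative choices, not the repository's deep-learning model:

```python
import numpy as np

def lasso_subgradient_descent(X, y, lam=0.05, iters=500):
    """Minimize the nonsmooth objective  0.5/n * ||Xw - y||^2 + lam * ||w||_1
    by subgradient descent with a diminishing 1/sqrt(t) step size."""
    n, d = X.shape
    w = np.zeros(d)
    for t in range(1, iters + 1):
        # np.sign(w) is a valid subgradient of ||w||_1 (choosing 0 where w_i == 0)
        g = X.T @ (X @ w - y) / n + lam * np.sign(w)
        w -= (0.5 / np.sqrt(t)) * g
    return w
```

On synthetic data with a sparse ground-truth weight vector, the iterates approach the true support up to the small bias introduced by the L1 penalty.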