## Spring term 2022

  1. Overview of the term topics and some applications of convex optimization (ru)

  2. Intro to numerical optimization methods. Gradient descent (ru)

  3. How to accelerate gradient descent

    • conjugate gradient method (ru)
    • heavy-ball method and fast gradient method (ru)
  4. Second-order methods: Newton's method; quasi-Newton methods as a trade-off between convergence speed and the cost of one iteration (ru)

  5. Non-smooth optimization problems: subgradient methods and intro to proximal methods (en)

  5*. Smoothing: smooth minimization of non-smooth functions (original paper)
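As a minimal illustration of lectures 2–3 (acceleration of gradient descent), the sketch below compares plain gradient descent with Polyak's heavy-ball method on a strongly convex quadratic. The problem instance, iteration counts, and parameter choices are illustrative assumptions, not course material:

```python
import numpy as np

# Illustrative comparison (not from the course): gradient descent vs. the
# heavy-ball method on f(x) = 0.5 * x^T A x - b^T x with A positive definite.

rng = np.random.default_rng(0)
n = 50
Q = rng.standard_normal((n, n))
A = Q.T @ Q + np.eye(n)           # symmetric positive definite
b = rng.standard_normal(n)
x_star = np.linalg.solve(A, b)    # exact minimizer, for error tracking

eigs = np.linalg.eigvalsh(A)
mu, L = eigs[0], eigs[-1]         # strong convexity / smoothness constants
grad = lambda x: A @ x - b

# Gradient descent with the classical step size 1/L
x = np.zeros(n)
for _ in range(200):
    x = x - grad(x) / L
err_gd = np.linalg.norm(x - x_star)

# Heavy-ball method with the parameters that are optimal for quadratics
alpha = 4.0 / (np.sqrt(L) + np.sqrt(mu)) ** 2
beta = ((np.sqrt(L) - np.sqrt(mu)) / (np.sqrt(L) + np.sqrt(mu))) ** 2
x, x_prev = np.zeros(n), np.zeros(n)
for _ in range(200):
    x, x_prev = x - alpha * grad(x) + beta * (x - x_prev), x
err_hb = np.linalg.norm(x - x_star)

print(err_gd, err_hb)
```

For an ill-conditioned quadratic like this one, the momentum term shrinks the error by orders of magnitude more per iteration than the plain 1/L step, which is the point of lecture 3.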

  6. Frank-Wolfe method (ru)

  7. General-purpose solvers

    • interior point methods
    • SQP as a generalization of interior point methods to non-convex problems
  8. How to parallelize optimization methods: penalty method, augmented Lagrangian method, and ADMM (ru)

  9. Stochastic gradient methods for non-convex, non-smooth, but structured objectives; training neural networks as a basic example (ru)
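The last lecture's method can be sketched in a few lines. Instead of a neural network, the toy objective here is a least-squares problem, whose full gradient is an average of cheap per-sample gradients; the data, step size, and epoch count are illustrative assumptions:

```python
import numpy as np

# Illustrative SGD sketch (not course code): minimize the average of
# per-sample squared losses 0.5 * (x_i^T w - y_i)^2 by sampling one
# gradient at a time instead of computing the full-batch gradient.

rng = np.random.default_rng(1)
m, n = 1000, 10
X = rng.standard_normal((m, n))
w_true = rng.standard_normal(n)
y = X @ w_true + 0.01 * rng.standard_normal(m)   # noisy linear model

w = np.zeros(n)
step = 0.01                                      # constant step size
for epoch in range(30):
    for i in rng.permutation(m):                 # one pass over shuffled data
        g = (X[i] @ w - y[i]) * X[i]             # gradient of sample i's loss
        w = w - step * g

err_sgd = np.linalg.norm(w - w_true)
print(err_sgd)
```

With a constant step size the iterates only reach a noise-dominated neighborhood of the minimizer rather than converging exactly, which is the basic trade-off the lecture develops before moving to neural-network training.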