This repository is dedicated to numerical optimization methods ✨.
All implemented methods are line search methods.
Implementations can be found in `unconstrained/unconstrained.py`
Optimization method that follows the direction of the negative gradient.
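A minimal sketch of the idea (a fixed step size is used here for brevity; the repository chooses the step via a line search):

```python
import numpy as np

def steepest_descent(grad, x0, step=0.1, tol=1e-6, max_iter=10_000):
    # Follow the negative gradient with a fixed step size
    # (illustration only; the actual implementation picks the step adaptively).
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:   # stop when the gradient is (nearly) zero
            break
        x = x - step * g
    return x
```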
Optimization method that uses second-order information. The Hessian matrix is estimated at every iteration; if it is not positive definite, a multiple of the identity matrix is added to make it so.
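A sketch of that safeguard, assuming the gradient and Hessian are available as callables (function and parameter names here are hypothetical, not the repository's API):

```python
import numpy as np

def newton_with_shift(grad, hess, x0, tau0=1e-3, tol=1e-8, max_iter=100):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        H = hess(x)
        tau = 0.0
        while True:
            try:
                # Cholesky succeeds iff H + tau*I is positive definite
                np.linalg.cholesky(H + tau * np.eye(len(x)))
                break
            except np.linalg.LinAlgError:
                tau = max(2.0 * tau, tau0)   # grow the shift until PD
        p = np.linalg.solve(H + tau * np.eye(len(x)), -g)  # (shifted) Newton step
        x = x + p
    return x
```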
Optimization method based on conjugacy of step directions.
The Polak–Ribière variation is implemented.
Update rule for a scalar that ensures conjugacy of step directions:
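A standard statement of the Polak–Ribière rule (the repository's exact notation may differ):

$\beta_{k+1} = \dfrac{\nabla f_{k+1}^{\top}(\nabla f_{k+1} - \nabla f_k)}{\lVert \nabla f_k \rVert^2}, \qquad p_{k+1} = -\nabla f_{k+1} + \beta_{k+1}\, p_k$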
Optimization method that uses an approximation of second order information.
The BFGS method is implemented.
The initial approximation of the Hessian matrix is the identity matrix.
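A self-contained BFGS sketch under those assumptions (identity initialization; a simple Armijo backtracking stands in for the repository's line search):

```python
import numpy as np

def _backtracking(f, grad, x, p, alpha=1.0, shrink=0.5, c=1e-4):
    # Armijo sufficient-decrease backtracking (stand-in line search).
    fx, slope = f(x), grad(x) @ p
    while f(x + alpha * p) > fx + c * alpha * slope:
        alpha *= shrink
    return alpha

def bfgs(f, grad, x0, tol=1e-6, max_iter=500):
    x = np.asarray(x0, dtype=float)
    n = len(x)
    H = np.eye(n)                      # inverse-Hessian approximation starts as identity
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                     # quasi-Newton search direction
        s = _backtracking(f, grad, x, p) * p
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        if s @ y > 1e-12:              # curvature guard keeps H positive definite
            rho = 1.0 / (s @ y)
            I = np.eye(n)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x
```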
$f(x) = 100(x_2-x_1^2)^2+(1-x_1)^2 \quad \text{min at } x=(1,1)$

$f(x) = 150(x_1 x_2)^2+(0.5x_1+2x_2-2)^2 \quad \text{min at } x=(0,1),(4,0)$
Starting points: (-0.2,1.2), (0,0), (-1,0)
- 50 points are generated on the [-1,1] interval; approximation uses a 2nd-order polynomial
- 100 points are generated on the [-3,3] interval; approximation uses a 3rd-order polynomial
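For illustration, the first setup can be reproduced with NumPy's built-in least-squares fit (the repository instead solves these problems with its own optimizers, and the target data below is hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 50)                    # 50 points on the [-1, 1] interval
y = x ** 3 + 0.05 * rng.standard_normal(x.size)   # hypothetical noisy target data
coeffs = np.polyfit(x, y, deg=2)                  # least-squares 2nd-order polynomial fit
residual = float(np.sum((np.polyval(coeffs, x) - y) ** 2))
```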
To see test results, run `python -m unconstrained.test_unconstrained` in the base directory.
Text output will be written to `unconstrained/test_unconstrained_results.txt`.
Graphical representations of the solutions for the least-squares problems can be found at `unconstrained/[optimization_method_name].png` (e.g. `unconstrained/The Steepest Descent.png`).
Here you can find some details about additional implemented functions.
Central-difference formula was used
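For reference, the standard central-difference approximation to each partial derivative (with $e_i$ the $i$-th unit vector and step size $h$) is:

$\dfrac{\partial f}{\partial x_i}(x) \approx \dfrac{f(x + h e_i) - f(x - h e_i)}{2h}$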
Float data type used: numpy.longdouble
On Windows with
On Linux (Ubuntu 18.04.2 LTS) with
Reason: on Windows the highest available precision of the float datatype is 15 decimal digits, while on Linux it is 18.
Recommendation from the Book (p.197)
Another option: forward-difference
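The standard forward-difference approximation, which needs one fewer function evaluation per component but has $O(h)$ rather than $O(h^2)$ error, is:

$\dfrac{\partial f}{\partial x_i}(x) \approx \dfrac{f(x + h e_i) - f(x)}{h}$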
Recommendation from the Book
On Windows with
On Linux (Ubuntu 18.04.2 LTS) with
Used formula
Used technique - Backtracking Line Search
A line search algorithm satisfying the Wolfe conditions can also be used, and is preferred.
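A minimal Armijo backtracking sketch (parameter values are illustrative, not necessarily the repository's):

```python
import numpy as np

def backtracking(f, grad, x, p, alpha=1.0, shrink=0.5, c=1e-4):
    # Shrink alpha until the Armijo sufficient-decrease condition holds:
    #   f(x + alpha*p) <= f(x) + c * alpha * grad(x) @ p
    fx = f(x)
    slope = grad(x) @ p    # directional derivative; negative for a descent direction
    while f(x + alpha * p) > fx + c * alpha * slope:
        alpha *= shrink
    return alpha
```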
- 📖 "Numerical Optimization" by Jorge Nocedal and Stephen J. Wright
Link: https://www.math.uci.edu/~qnie/Publications/NumericalOptimization.pdf