# directsearch
is a package for solving unconstrained minimization problems without requiring derivatives of the objective. It is particularly useful when evaluations of the objective function are expensive and/or noisy.
It implements a family of direct search methods. For general references on these methods, please consult:
- A. R. Conn, K. Scheinberg, and L. N. Vicente. Introduction to Derivative-Free Optimization. SIAM, 2009.
- C. Audet and W. Hare. Derivative-Free and Blackbox Optimization. Springer, 2017.
- T. G. Kolda, R. M. Lewis, and V. Torczon. Optimization by Direct Search: New Perspectives on Some Classical and Modern Methods. SIAM Review, 45(3), 2003, 385-482.
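To give a flavor of this method family (a minimal sketch for illustration, not this package's implementation), classical compass/coordinate search polls the objective along the positive and negative coordinate directions, accepting any improving point and halving the step size when no poll point improves:

```python
import numpy as np

def compass_search(f, x0, alpha0=1.0, alpha_min=1e-6, maxevals=5000):
    """Minimal coordinate (compass) search: derivative-free polling
    along +/- e_i, halving the step size on unsuccessful iterations."""
    x = np.asarray(x0, dtype=float)
    fx, nf, alpha = f(x), 1, alpha0
    while alpha > alpha_min and nf < maxevals:
        improved = False
        for i in range(len(x)):
            for s in (+1.0, -1.0):
                trial = x.copy()
                trial[i] += s * alpha  # poll point x +/- alpha*e_i
                ft = f(trial)
                nf += 1
                if ft < fx:  # simple decrease: accept and re-poll
                    x, fx, improved = trial, ft, True
                    break
            if improved:
                break
        if not improved:
            alpha *= 0.5  # unsuccessful iteration: shrink step size
    return x, fx, nf

# Toy run: minimize sum((v - 1)^2), whose minimizer is v = (1, 1)
x, fx, nf = compass_search(lambda v: float(np.sum((v - 1.0)**2)), np.zeros(2))
```

Note that no gradients are evaluated anywhere; the step size `alpha` alone drives both progress and termination.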
This package extends general direct search methods to use randomized methods for improved practical performance and scalability.
If you use this package, please cite:
L Roberts, and C W Royer. Direct search based on probabilistic descent in reduced spaces, SIAM J. Optim., 33(4):3057-3082, 2023.
Please install using pip:

```shell
$ pip install [--user] directsearch
```
To instead install from source, run:

```shell
$ git clone git@github.com:lindonroberts/directsearch.git
$ cd directsearch
$ pip install -e .
```

The `-e` option installs the package in editable mode, so any changes you make to the source code are picked up by your Python installation.
This package can solve unconstrained nonlinear optimization problems of the form: min_{x in R^n} f(x). The simplest usage of `directsearch` is

```python
soln = directsearch.solve(f, x0)
```

where:
- `f` is a callable objective function, taking a `numpy.ndarray` of the same shape as `x0` and returning a single `float`.
- `x0` is a one-dimensional `numpy.ndarray` (i.e. `len(x0.shape) == 1`) giving the starting point for the algorithm. It should be the best available guess of the minimizer.
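For example, a valid objective and starting point for a simple two-dimensional quadratic (illustrative only) would be:

```python
import numpy as np

# Objective: takes a numpy array the same shape as x0, returns a float
def f(x):
    return float((x[0] - 3.0)**2 + 10.0 * (x[1] + 1.0)**2)

# One-dimensional starting point: our best guess of the minimizer
x0 = np.array([0.0, 0.0])

assert len(x0.shape) == 1        # x0 must be one-dimensional
assert isinstance(f(x0), float)  # f must return a single float
```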
The output is an object with fields:
- `soln.x`: the approximate minimizer, the best `x` value found (a `numpy.ndarray` of the same shape as `x0`).
- `soln.f`: the minimum objective value found, equal to `f(soln.x)`.
- `soln.nf`: the number of evaluations of `f` required by the solve routine.
- `soln.flag`: an integer indicating the reason for termination.
- `soln.msg`: a string with a human-readable termination message.
The possible values of `soln.flag` are:
- `soln.EXIT_MAXFUN_REACHED`: termination on the maximum number of objective evaluations.
- `soln.EXIT_ALPHA_MIN_REACHED`: termination on small step size (success).
You can print information about the solution using `print(soln)`. The `examples` directory has several scripts showing the usage of `directsearch`.
## Interfaces to solver instances

There are many configurable options for the solver in `directsearch`, and several ways to call specific direct search algorithm implementations. The full set of available functions is:
- `directsearch.solve()` applies a direct-search method to a given optimization problem. It is the most flexible available routine.
- `directsearch.solve_directsearch()` applies regular direct-search techniques without sketching [1,2,3].
- `directsearch.solve_probabilistic_directsearch()` applies direct search based on probabilistic descent without sketching [4].
- `directsearch.solve_subspace_directsearch()` applies direct-search schemes based on polling directions in random subspaces [5].
- `directsearch.solve_stp()` applies the stochastic three points method, a particular direct-search technique [6].
## Optional parameters and more information

See `usage.txt` for full details on how to call these functions. The most commonly used optional inputs (to all functions) are:
- `maxevals`: the maximum number of allowed evaluations of `f` during the solve.
- `verbose`: a `bool` indicating whether or not to print progress information.
- `print_freq`: an `int` indicating how frequently to print progress information (1 means at every iteration).
## Choosing a solver instance

As a rule of thumb, if `len(x0)` is not too large (e.g. less than 50), then `solve_directsearch()` or `solve_probabilistic_directsearch()` are suitable choices. Of these, `solve_probabilistic_directsearch()` will generally solve with fewer evaluations of `f`, but `solve_directsearch()` is the only deterministic algorithm (i.e. reproducible without setting the NumPy random seed). If `len(x0)` is larger, then `directsearch.solve_subspace_directsearch()` may be a better option.
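The random-subspace idea behind `solve_subspace_directsearch()` can be sketched as follows (a simplified illustration only, not the package's actual implementation): at each iteration, instead of polling a fixed set of 2n coordinate directions, poll along the ± columns of a random n-by-r sketch matrix with r much smaller than n, accepting a step only if it gives sufficient decrease.

```python
import numpy as np

def subspace_poll_step(f, x, alpha, r=2, rng=None):
    """One simplified polling step in a random r-dimensional subspace.

    Returns the new point and whether the step was accepted (using a
    standard alpha^2 sufficient-decrease forcing function)."""
    rng = np.random.default_rng() if rng is None else rng
    n = len(x)
    P = rng.standard_normal((n, r)) / np.sqrt(r)  # random sketch matrix
    fx = f(x)
    for d in np.hstack([P, -P]).T:  # poll along +/- each column
        trial = x + alpha * d
        if f(trial) < fx - alpha**2:  # sufficient decrease test
            return trial, True
    return x, False  # unsuccessful: caller shrinks alpha

# Toy run on a quadratic in R^10: only 4 poll points per iteration
# instead of the 20 needed by full coordinate polling
rng = np.random.default_rng(0)
f = lambda v: float(np.dot(v, v))
x, alpha = np.ones(10), 1.0
for _ in range(200):
    x, ok = subspace_poll_step(f, x, alpha, r=2, rng=rng)
    alpha = 2.0 * alpha if ok else 0.5 * alpha  # expand/shrink step
```

The payoff is that the per-iteration polling cost scales with r rather than n, which is what makes this variant attractive for larger problems.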
## References

1. A. R. Conn, K. Scheinberg, and L. N. Vicente. Introduction to Derivative-Free Optimization. SIAM, 2009.
2. C. Audet and W. Hare. Derivative-Free and Blackbox Optimization. Springer, 2017.
3. T. G. Kolda, R. M. Lewis, and V. Torczon. Optimization by Direct Search: New Perspectives on Some Classical and Modern Methods. SIAM Review, 45(3), 2003, 385-482.
4. S. Gratton, C. W. Royer, L. N. Vicente, and Z. Zhang. Direct Search Based on Probabilistic Descent. SIAM J. Optimization, 25(3), 2015, 1515-1541.
5. L. Roberts and C. W. Royer. Direct Search Based on Probabilistic Descent in Reduced Spaces. SIAM J. Optimization, 33(4), 2023, 3057-3082.
6. E. H. Bergou, E. Gorbunov, and P. Richtarik. Stochastic Three Points Method for Unconstrained Smooth Minimization. SIAM J. Optimization, 30(4), 2020, 2726-2749.
Please report any bugs using GitHub's issue tracker.
This package is released under the GNU GPL license.