
Commit

Pkm/depfminbox (#446)
* Properly deprecate Fminbox.

* Fix Fminbox docs.
pkofod committed Jul 23, 2017
1 parent ae86202 commit bb5cc12
Showing 2 changed files with 52 additions and 5 deletions.
8 changes: 4 additions & 4 deletions docs/src/user/minimization.md
@@ -70,22 +70,22 @@ A primal interior-point algorithm for simple "box" constraints (lower and upper
lower = [1.25, -2.1]
upper = [Inf, Inf]
initial_x = [2.0, 2.0]
- results = optimize(OnceDifferentiable(f, g!), initial_x, lower, upper, Fminbox(), optimizer = GradientDescent)
+ results = optimize(OnceDifferentiable(f, g!), initial_x, lower, upper, Fminbox{GradientDescent}())
```

- This performs optimization with a barrier penalty, successively scaling down the barrier coefficient and using the chosen `optimizer` for convergence at each step. Notice that the `Optimizer` type, not an instance, should be passed. This means that the keyword should be passed as `optimizer = GradientDescent`, not `optimizer = GradientDescent()` as you usually would.
+ This performs optimization with a barrier penalty, successively scaling down the barrier coefficient and using the chosen `optimizer` (`GradientDescent` above) for convergence at each step. Notice that the `Optimizer` type, not an instance, should be passed (`GradientDescent`, not `GradientDescent()`).

This algorithm uses diagonal preconditioning to improve accuracy, and is therefore a good example of how to use `ConjugateGradient` or `LBFGS` with preconditioning. Other methods currently do not use preconditioning. Only the box constraints are used. If you can analytically compute the diagonal of the Hessian of your objective function, you may want to consider writing your own preconditioner.

There are two iteration parameters: an outer iterations parameter that controls `Fminbox` and an inner iterations parameter that controls the inner optimizer. For this reason, the options syntax differs slightly from the rest of the package. All parameters regarding the outer iterations are passed as keyword arguments, while options for the inner optimizer are passed as an `Optim.Options` instance using the keyword `optimizer_o`.

For example, the following restricts the optimization to 2 major iterations
```julia
- results = optimize(OnceDifferentiable(f, g!), initial_x, lower, upper, Fminbox(); optimizer = GradientDescent, iterations = 2)
+ results = optimize(OnceDifferentiable(f, g!), initial_x, lower, upper, Fminbox{GradientDescent}(); iterations = 2)
```
In contrast, the following sets the maximum number of iterations for each inner `GradientDescent` optimization to 2
```julia
- results = Optim.optimize(OnceDifferentiable(f, g!), initial_x, lower, upper, Fminbox(); optimizer = GradientDescent, optimizer_o = Optim.Options(iterations = 2))
+ results = Optim.optimize(OnceDifferentiable(f, g!), initial_x, lower, upper, Fminbox{GradientDescent}(); optimizer_o = Optim.Options(iterations = 2))
```
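For context, the new call style introduced by this commit can be sketched as a self-contained script. The objective `f` and gradient `g!` are not shown in this diff; the Rosenbrock function below is an assumed stand-in, and the gradient argument order is the convention assumed for the Optim version of this commit.

```julia
using Optim

# Assumed objective: the Rosenbrock function, a common test problem in the Optim docs.
f(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2

# In-place gradient. Argument order g!(storage, x) is assumed here; older
# Optim releases used g!(x, storage), so check the version you are running.
function g!(storage, x)
    storage[1] = -2.0 * (1.0 - x[1]) - 400.0 * (x[2] - x[1]^2) * x[1]
    storage[2] = 200.0 * (x[2] - x[1]^2)
end

lower = [1.25, -2.1]
upper = [Inf, Inf]
initial_x = [2.0, 2.0]

# New API from this commit: the inner optimizer is a type parameter of Fminbox,
# not an `optimizer` keyword argument.
results = optimize(OnceDifferentiable(f, g!), initial_x, lower, upper,
                   Fminbox{GradientDescent}())
```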
## Minimizing a univariate function on a bounded interval

49 changes: 48 additions & 1 deletion src/deprecate.jl
@@ -1 +1,48 @@
Base.@deprecate method(x) summary(x)

const has_deprecated_fminbox = Ref(false)
function optimize{T<:AbstractFloat}(
    df::OnceDifferentiable,
    initial_x::Array{T},
    l::Array{T},
    u::Array{T},
    ::Type{Fminbox};
    x_tol::T = eps(T),
    f_tol::T = sqrt(eps(T)),
    g_tol::T = sqrt(eps(T)),
    allow_f_increases::Bool = true,
    iterations::Integer = 1_000,
    store_trace::Bool = false,
    show_trace::Bool = false,
    extended_trace::Bool = false,
    callback = nothing,
    show_every::Integer = 1,
    linesearch = LineSearches.HagerZhang(),
    eta::Real = convert(T, 0.4),
    mu0::T = convert(T, NaN),
    mufactor::T = convert(T, 0.001),
    precondprep = (P, x, l, u, mu) -> precondprepbox!(P, x, l, u, mu),
    optimizer = ConjugateGradient,
    optimizer_o = Options(store_trace = store_trace,
                          show_trace = show_trace,
                          extended_trace = extended_trace),
    nargs...)
    if !has_deprecated_fminbox[]
        warn("Fminbox with the optimizer keyword is deprecated, construct Fminbox{optimizer}() and pass it to optimize(...) instead.")
        has_deprecated_fminbox[] = true
    end
    optimize(df, initial_x, l, u, Fminbox{optimizer}();
             allow_f_increases=allow_f_increases,
             iterations=iterations,
             store_trace=store_trace,
             show_trace=show_trace,
             extended_trace=extended_trace,
             show_every=show_every,
             callback=callback,
             linesearch=linesearch,
             eta=eta,
             mu0=mu0,
             mufactor=mufactor,
             precondprep=precondprep,
             optimizer_o=optimizer_o)
end
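The shim above illustrates a warn-once deprecation pattern: a module-level `Ref{Bool}` guard ensures the message prints at most once per session while still forwarding every call to the new method. A minimal standalone sketch of the same pattern, with hypothetical names `old_api` and `new_api`:

```julia
# Module-level flag recording whether the warning has already fired.
const has_warned_old_api = Ref(false)

new_api(x) = 2x  # hypothetical replacement function

function old_api(x)
    if !has_warned_old_api[]
        # Base.warn, as used in the Julia 0.6-era code above.
        warn("old_api is deprecated, use new_api instead.")
        has_warned_old_api[] = true  # suppress the message on later calls
    end
    new_api(x)  # forward to the replacement
end
```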
