[WIP] Boosting FW #212

Draft: mattplo wants to merge 14 commits into master
Conversation

mattplo commented Jul 7, 2021

No description provided.

[8 resolved review threads on src/oracles.jl (outdated)]
mattplo closed this Jul 21, 2021
mattplo reopened this Jul 21, 2021
codecov-commenter commented Jul 21, 2021

Codecov Report

Merging #212 (5bf00af) into master (8d8e734) will increase coverage by 0.29%.
The diff coverage is 79.48%.


@@            Coverage Diff             @@
##           master     #212      +/-   ##
==========================================
+ Coverage   68.85%   69.14%   +0.29%     
==========================================
  Files          14       14              
  Lines        1692     1747      +55     
==========================================
+ Hits         1165     1208      +43     
- Misses        527      539      +12     
Impacted Files           Coverage Δ
src/FrankWolfe.jl        100.00% <ø> (ø)
src/afw.jl               74.25% <66.66%> (ø)
src/oracles.jl           83.00% <77.58%> (-3.73%) ⬇️
src/blended_cg.jl        50.93% <100.00%> (ø)
src/fw_algorithms.jl     76.28% <100.00%> (ø)
src/simplex_oracles.jl   92.30% <100.00%> (ø)
src/norm_oracles.jl      97.36% <0.00%> (+1.31%) ⬆️

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update 8d8e734...5bf00af.

mattplo (Author) commented Aug 6, 2021

Benchmark calls to compute_extreme_point with ChasingGradientLMO

Profile compute_extreme_point

Compare Vanilla FW and Boosting FW (capped to 1 round) for several LMOs; a minimal setup sketch is given below, followed by the logs.
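
For context, a minimal sketch of how such a run might be set up, assuming a toy quadratic objective; the `ChasingGradientLMO` constructor signature is an assumption from this WIP PR (not a published API), and keyword names vary across FrankWolfe.jl versions:

```julia
using FrankWolfe
using LinearAlgebra
using BenchmarkTools

# Toy quadratic: min_x ||x - b||^2 over the ball defined by lmo_inner.
n = 1000
b = randn(n)
f(x) = norm(x - b)^2
grad!(storage, x) = (@. storage = 2 * (x - b))

lmo_inner = FrankWolfe.LpNormLMO{Float64,1}(1.0)
# collect(...) yields a mutable starting point (memory emphasis writes into x0)
x0 = collect(FrankWolfe.compute_extreme_point(lmo_inner, randn(n)))

# -------- Vanilla FW
FrankWolfe.frank_wolfe(
    f, grad!, lmo_inner, copy(x0);
    max_iteration=1000, epsilon=1e-7,
    line_search=FrankWolfe.Adaptive(), verbose=true,
)

# -------- Boosting FW: same solver, but the inner LMO is wrapped so each
# compute_extreme_point call runs (here) a single gradient-chasing round.
lmo_boosted = ChasingGradientLMO(lmo_inner, 1)  # hypothetical signature from this PR
FrankWolfe.frank_wolfe(
    f, grad!, lmo_boosted, copy(x0);
    max_iteration=1000, epsilon=1e-7,
    line_search=FrankWolfe.Adaptive(), verbose=true,
)

# Benchmark a single boosted oracle call:
g = randn(n)
@btime FrankWolfe.compute_extreme_point($lmo_boosted, $g)
```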

lmo_inner = LpNormLMO{Float64, 1}(1.0)
-------- Vanilla FW

Vanilla Frank-Wolfe Algorithm.
EMPHASIS: memory STEPSIZE: Adaptive EPSILON: 1.0e-7 MAXITERATION: 1000 TYPE: Float64
MOMENTUM: nothing GRADIENTTYPE: Nothing
WARNING: In memory emphasis mode iterates are written back into x0!

-------------------------------------------------------------------------------------------------
  Type     Iteration         Primal           Dual       Dual Gap           Time         It/sec
-------------------------------------------------------------------------------------------------
     I             0   1.332871e-05  -2.667877e-05   4.000748e-05   0.000000e+00            NaN
    FW           250   1.323888e-05  -2.664119e-05   3.988007e-05   1.586719e+00   1.575578e+02
    FW           500   1.314963e-05  -2.658990e-05   3.973953e-05   2.982284e+00   1.676567e+02
    FW           750   1.306098e-05  -2.654322e-05   3.960420e-05   4.409147e+00   1.701009e+02
    FW          1000   1.297290e-05  -2.651077e-05   3.948366e-05   5.889026e+00   1.698074e+02
  Last          1000   1.297259e-05  -2.651068e-05   3.948326e-05   5.901882e+00   1.696069e+02
-------------------------------------------------------------------------------------------------

-------- Boosting FW

Vanilla Frank-Wolfe Algorithm.
EMPHASIS: memory STEPSIZE: Adaptive EPSILON: 1.0e-7 MAXITERATION: 1000 TYPE: Float64
MOMENTUM: nothing GRADIENTTYPE: Vector{Float64}
WARNING: In memory emphasis mode iterates are written back into x0!

-------------------------------------------------------------------------------------------------
  Type     Iteration         Primal           Dual       Dual Gap           Time         It/sec
-------------------------------------------------------------------------------------------------
     I             0   1.332871e-05  -2.667877e-05   4.000748e-05   0.000000e+00            NaN
    FW           250   1.323888e-05  -2.664119e-05   3.988007e-05   1.095984e+00   2.281055e+02
    FW           500   1.314963e-05  -2.658990e-05   3.973953e-05   2.092250e+00   2.389772e+02
    FW           750   1.306098e-05  -2.654322e-05   3.960420e-05   3.003983e+00   2.496685e+02
    FW          1000   1.297290e-05  -2.651077e-05   3.948366e-05   3.982501e+00   2.510985e+02
  Last          1000   1.297259e-05  -2.651068e-05   3.948326e-05   3.990126e+00   2.508693e+02
-------------------------------------------------------------------------------------------------

lmo_inner = LpNormLMO{Float64, 2}(1.0)
-------- Vanilla FW

Vanilla Frank-Wolfe Algorithm.
EMPHASIS: memory STEPSIZE: Adaptive EPSILON: 1.0e-7 MAXITERATION: 1000 TYPE: Float64
MOMENTUM: nothing GRADIENTTYPE: Nothing
WARNING: In memory emphasis mode iterates are written back into x0!

-------------------------------------------------------------------------------------------------
  Type     Iteration         Primal           Dual       Dual Gap           Time         It/sec
-------------------------------------------------------------------------------------------------
     I             0   9.936888e-01  -2.993665e+00   3.987354e+00   0.000000e+00            NaN
    FW           250   6.233151e-31   1.501500e-15  -1.501500e-15   2.007539e+00   1.245306e+02
  Last           250   6.233151e-31   1.501500e-15  -1.501500e-15   2.024618e+00   1.239740e+02
-------------------------------------------------------------------------------------------------

-------- Boosting FW

Vanilla Frank-Wolfe Algorithm.
EMPHASIS: memory STEPSIZE: Adaptive EPSILON: 1.0e-7 MAXITERATION: 1000 TYPE: Float64
MOMENTUM: nothing GRADIENTTYPE: Vector{Float64}
WARNING: In memory emphasis mode iterates are written back into x0!

-------------------------------------------------------------------------------------------------
  Type     Iteration         Primal           Dual       Dual Gap           Time         It/sec
-------------------------------------------------------------------------------------------------
     I             0   9.936888e-01  -2.993665e+00   3.987354e+00   0.000000e+00            NaN
    FW           250   6.233151e-31   1.501500e-15  -1.501500e-15   3.896185e+00   6.416533e+01
  Last           250   6.233151e-31   1.501500e-15  -1.501500e-15   3.927115e+00   6.391460e+01
-------------------------------------------------------------------------------------------------

lmo_inner = FrankWolfe.ProbabilitySimplexOracle{Float64}(1.0)
-------- Vanilla FW

Vanilla Frank-Wolfe Algorithm.
EMPHASIS: memory STEPSIZE: Adaptive EPSILON: 1.0e-7 MAXITERATION: 1000 TYPE: Float64
MOMENTUM: nothing GRADIENTTYPE: Nothing
WARNING: In memory emphasis mode iterates are written back into x0!

-------------------------------------------------------------------------------------------------
  Type     Iteration         Primal           Dual       Dual Gap           Time         It/sec
-------------------------------------------------------------------------------------------------
     I             0   1.000001e+00  -1.000027e+00   2.000028e+00   0.000000e+00            NaN
    FW           250   4.406669e-03  -4.459474e-03   8.866142e-03   1.617583e+00   1.545516e+02
    FW           500   2.192075e-03  -2.244912e-03   4.436987e-03   3.306393e+00   1.512222e+02
    FW           750   1.452857e-03  -1.505618e-03   2.958475e-03   4.919544e+00   1.524532e+02
    FW          1000   1.083092e-03  -1.135762e-03   2.218854e-03   6.411167e+00   1.559778e+02
  Last          1000   1.082104e-03  -1.134774e-03   2.216878e-03   6.425447e+00   1.557868e+02
-------------------------------------------------------------------------------------------------

-------- Boosting FW

Vanilla Frank-Wolfe Algorithm.
EMPHASIS: memory STEPSIZE: Adaptive EPSILON: 1.0e-7 MAXITERATION: 1000 TYPE: Float64
MOMENTUM: nothing GRADIENTTYPE: Vector{Float64}
WARNING: In memory emphasis mode iterates are written back into x0!

-------------------------------------------------------------------------------------------------
  Type     Iteration         Primal           Dual       Dual Gap           Time         It/sec
-------------------------------------------------------------------------------------------------
     I             0   1.000001e+00  -1.000027e+00   2.000028e+00   0.000000e+00            NaN
    FW           250   4.406669e-03  -4.459474e-03   8.866142e-03   9.377320e-01   2.666007e+02
    FW           500   2.192075e-03  -2.244912e-03   4.436987e-03   1.964084e+00   2.545716e+02
    FW           750   1.452857e-03  -1.505618e-03   2.958475e-03   2.958777e+00   2.534831e+02
    FW          1000   1.083092e-03  -1.135762e-03   2.218854e-03   3.886405e+00   2.573072e+02
  Last          1000   1.082104e-03  -1.134774e-03   2.216878e-03   3.893230e+00   2.571130e+02
-------------------------------------------------------------------------------------------------

Capped to 100 rounds


lmo_inner = LpNormLMO{Float64, 1}(1.0)
-------- Vanilla FW

Vanilla Frank-Wolfe Algorithm.
EMPHASIS: memory STEPSIZE: Adaptive EPSILON: 1.0e-7 MAXITERATION: 2000 TYPE: Float64
MOMENTUM: nothing GRADIENTTYPE: Nothing
WARNING: In memory emphasis mode iterates are written back into x0!

-------------------------------------------------------------------------------------------------
  Type     Iteration         Primal           Dual       Dual Gap           Time         It/sec
-------------------------------------------------------------------------------------------------
     I             0   1.332871e-05  -2.667877e-05   4.000748e-05   0.000000e+00            NaN
    FW           250   1.323888e-05  -2.664119e-05   3.988007e-05   1.642726e+00   1.521861e+02
    FW           500   1.314963e-05  -2.658990e-05   3.973953e-05   3.039218e+00   1.645160e+02
    FW           750   1.306098e-05  -2.654322e-05   3.960420e-05   4.440693e+00   1.688926e+02
    FW          1000   1.297290e-05  -2.651077e-05   3.948366e-05   5.856483e+00   1.707510e+02
    FW          1250   1.288535e-05  -2.646685e-05   3.935219e-05   7.298466e+00   1.712689e+02
    FW          1500   1.279838e-05  -2.640961e-05   3.920799e-05   8.731998e+00   1.717820e+02
    FW          1750   1.271202e-05  -2.635734e-05   3.906936e-05   1.016585e+01   1.721450e+02
    FW          2000   1.262626e-05  -2.630209e-05   3.892835e-05   1.165952e+01   1.715337e+02
  Last          2000   1.262596e-05  -2.630116e-05   3.892712e-05   1.167184e+01   1.714383e+02
-------------------------------------------------------------------------------------------------

-------- Boosting FW

Vanilla Frank-Wolfe Algorithm.
EMPHASIS: memory STEPSIZE: Adaptive EPSILON: 1.0e-7 MAXITERATION: 250 TYPE: Float64
MOMENTUM: nothing GRADIENTTYPE: Vector{Float64}
WARNING: In memory emphasis mode iterates are written back into x0!

-------------------------------------------------------------------------------------------------
  Type     Iteration         Primal           Dual       Dual Gap           Time         It/sec
-------------------------------------------------------------------------------------------------
     I             0   1.332871e-05  -2.667773e-05   4.000644e-05   0.000000e+00            NaN
    FW            50   1.319334e-05  -2.661661e-05   3.980994e-05   5.836027e-01   8.567473e+01
    FW           100   1.305697e-05  -2.653844e-05   3.959541e-05   1.054344e+00   9.484572e+01
    FW           150   1.291749e-05  -2.648093e-05   3.939841e-05   1.570744e+00   9.549613e+01
    FW           200   1.278891e-05  -2.640150e-05   3.919041e-05   2.080849e+00   9.611462e+01
    FW           250   1.266568e-05  -2.633374e-05   3.899943e-05   2.593901e+00   9.637993e+01
  Last           250   1.266446e-05  -2.633080e-05   3.899527e-05   2.612086e+00   9.609178e+01
-------------------------------------------------------------------------------------------------

lmo_inner = LpNormLMO{Float64, 2}(1.0)
-------- Vanilla FW

Vanilla Frank-Wolfe Algorithm.
EMPHASIS: memory STEPSIZE: Adaptive EPSILON: 1.0e-7 MAXITERATION: 2000 TYPE: Float64
MOMENTUM: nothing GRADIENTTYPE: Nothing
WARNING: In memory emphasis mode iterates are written back into x0!

-------------------------------------------------------------------------------------------------
  Type     Iteration         Primal           Dual       Dual Gap           Time         It/sec
-------------------------------------------------------------------------------------------------
     I             0   9.936888e-01  -2.993665e+00   3.987354e+00   0.000000e+00            NaN
    FW           250   6.233151e-31   1.501500e-15  -1.501500e-15   1.999824e+00   1.250110e+02
  Last           250   6.233151e-31   1.501500e-15  -1.501500e-15   2.015370e+00   1.245429e+02
-------------------------------------------------------------------------------------------------

-------- Boosting FW

Vanilla Frank-Wolfe Algorithm.
EMPHASIS: memory STEPSIZE: Adaptive EPSILON: 1.0e-7 MAXITERATION: 250 TYPE: Float64
MOMENTUM: nothing GRADIENTTYPE: Vector{Float64}
WARNING: In memory emphasis mode iterates are written back into x0!

-------------------------------------------------------------------------------------------------
  Type     Iteration         Primal           Dual       Dual Gap           Time         It/sec
-------------------------------------------------------------------------------------------------
     I             0   9.936888e-01  -2.993665e+00   3.987354e+00   0.000000e+00            NaN
    FW            50   6.233151e-31   1.501500e-15  -1.501500e-15   9.438202e-01   5.297619e+01
  Last            50   6.233151e-31   1.501500e-15  -1.501500e-15   9.823848e-01   5.191448e+01
-------------------------------------------------------------------------------------------------

lmo_inner = FrankWolfe.ProbabilitySimplexOracle{Float64}(1.0)
-------- Vanilla FW

Vanilla Frank-Wolfe Algorithm.
EMPHASIS: memory STEPSIZE: Adaptive EPSILON: 1.0e-7 MAXITERATION: 2000 TYPE: Float64
MOMENTUM: nothing GRADIENTTYPE: Nothing
WARNING: In memory emphasis mode iterates are written back into x0!

-------------------------------------------------------------------------------------------------
  Type     Iteration         Primal           Dual       Dual Gap           Time         It/sec
-------------------------------------------------------------------------------------------------
     I             0   1.000001e+00  -1.000027e+00   2.000028e+00   0.000000e+00            NaN
    FW           250   4.406669e-03  -4.459474e-03   8.866142e-03   1.502652e+00   1.663725e+02
    FW           500   2.192075e-03  -2.244912e-03   4.436987e-03   3.038810e+00   1.645381e+02
    FW           750   1.452857e-03  -1.505618e-03   2.958475e-03   4.630816e+00   1.619585e+02
    FW          1000   1.083092e-03  -1.135762e-03   2.218854e-03   6.141378e+00   1.628299e+02
    FW          1250   8.611958e-04  -9.137522e-04   1.774948e-03   7.579845e+00   1.649110e+02
    FW          1500   7.132598e-04  -7.656822e-04   1.478942e-03   9.030188e+00   1.661095e+02
    FW          1750   6.075959e-04  -6.598846e-04   1.267481e-03   1.061456e+01   1.648678e+02
    FW          2000   5.283549e-04  -5.805046e-04   1.108859e-03   1.212111e+01   1.650015e+02
  Last          2000   5.281077e-04  -5.802562e-04   1.108364e-03   1.213293e+01   1.649230e+02
-------------------------------------------------------------------------------------------------

-------- Boosting FW

Vanilla Frank-Wolfe Algorithm.
EMPHASIS: memory STEPSIZE: Adaptive EPSILON: 1.0e-7 MAXITERATION: 250 TYPE: Float64
MOMENTUM: nothing GRADIENTTYPE: Vector{Float64}
WARNING: In memory emphasis mode iterates are written back into x0!

-------------------------------------------------------------------------------------------------
  Type     Iteration         Primal           Dual       Dual Gap           Time         It/sec
-------------------------------------------------------------------------------------------------
     I             0   1.000001e+00  -1.000027e+00   2.000028e+00   0.000000e+00            NaN
    FW            50   2.487741e-04  -2.997027e-04   5.484768e-04   4.802195e+00   1.041191e+01
    FW           100   1.366335e-04  -1.859632e-04   3.225967e-04   9.551926e+00   1.046909e+01
    FW           150   9.923454e-05  -1.473400e-04   2.465746e-04   1.343974e+01   1.116093e+01
    FW           200   7.915443e-05  -1.262317e-04   2.053861e-04   1.689208e+01   1.183987e+01
    FW           250   6.621227e-05  -1.123438e-04   1.785561e-04   2.015107e+01   1.240629e+01
  Last           250   6.602084e-05  -1.121366e-04   1.781574e-04   2.028151e+01   1.237580e+01
-------------------------------------------------------------------------------------------------

matbesancon (Member) commented

Looking at it, the most promising improvement I see: we are operating on

d = sum_k lambda_k (v_k - x)

We could instead operate on

d = sum_k lambda_k v_k

and then, when needed, use

d + x * sum_k lambda_k

The reasons are:

  1. this makes d sparse, which makes certain operations easier to perform;
  2. we don't need the final re-addition += x (a sketch of this accumulation is given below).
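
A minimal sketch of the suggested accumulation, assuming the convention d = sum_k lambda_k (v_k - x); under that convention the dense x term enters with a minus sign when the full direction is recovered (the sign flips if the code builds d = sum_k lambda_k (x - v_k)). The helper names and the use of SparseArrays are illustrative, not the PR's actual code:

```julia
using SparseArrays

# Accumulate only sum_k lambda_k * v_k (sparse if the vertices v_k are sparse)
# and track sum_k lambda_k separately, instead of folding the dense x
# into every term of the sum.
function accumulate_direction(vertices, lambdas)
    n = length(first(vertices))
    d = spzeros(n)
    lambda_sum = 0.0
    for (v, λ) in zip(vertices, lambdas)
        d += λ * v            # sparse + sparse stays sparse
        lambda_sum += λ
    end
    return d, lambda_sum
end

# Recover the full direction sum_k lambda_k (v_k - x) only when needed,
# paying for the dense x term once instead of at every round.
full_direction(d, lambda_sum, x) = d - lambda_sum * x

# Example with two signed unit vectors (vertices of the l1 ball):
vs = [sparsevec([1], [1.0], 5), sparsevec([3], [-1.0], 5)]
λs = [0.7, 0.3]
d, s = accumulate_direction(vs, λs)
x = fill(0.2, 5)
full_direction(d, s, x)
```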
