
Ideas for v4.0 #12

Open
jmejia8 opened this issue Jan 31, 2022 · 11 comments

Comments

jmejia8 commented Jan 31, 2022

Some ideas for v4

Implement Flux-inspired algorithms
Example:

# algorithm
GA = Chain(tournament(), SBX(p=0.9), PM(p=0.1), Environmental(base=:nsga2))


# usage
P = InitialPopulation(N=100)
GA(P) # produces a new population

## optimize
for gen in 1:100
    P = GA(P)
end
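For the composition itself, a minimal sketch of such a `Chain` could look like the following. Note that `tournament`, `SBX`, `PM`, and `Environmental` from the example above are hypothetical operator names; only the chaining mechanism is implemented here, with plain functions as stand-ins for operators.

```julia
# A minimal, Flux-style Chain for evolutionary operators: each "layer" is any
# callable that maps a population to a new population.
struct Chain{T<:Tuple}
    layers::T
end
Chain(layers...) = Chain(layers)

# Applying a Chain threads the population through each operator in order.
(c::Chain)(population) = foldl((pop, layer) -> layer(pop), c.layers; init = population)
```

With stand-in operators, usage would mirror the example above: `GA = Chain(select, crossover, mutate)` and then `P = GA(P)` inside the generation loop.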

fipelle commented May 19, 2022

Hi there,

I am not sure if it has already been implemented, but it would be nice to have an easy way to specify relative convergence criteria rather than absolute ones.


jmejia8 commented May 19, 2022

As far as I know, it can be difficult to implement a viable relative convergence criterion in metaheuristics (due to their stochastic operations). It would be helpful to know of some relative convergence criteria designed specifically for metaheuristics...

Maybe you can use the diff_check stopping criterion already implemented in Metaheuristics.jl as a relative convergence criterion:

Example (Metaheuristics v3.2.6):

julia> using Metaheuristics

julia> f, bounds, _ = Metaheuristics.TestProblems.sphere();

julia> options = Options(f_tol=1e-8,seed=1); # give tolerance

julia> optimize(f, bounds, ECA(;options))
+=========== RESULT ==========+
  iteration: 228
    minimum: 2.72864e-09
  minimizer: [-1.404483151684814e-5, -1.645523586390722e-5, 1.9944474758572303e-5, -8.336438266970471e-6, 1.3988146389886503e-5, -1.3177919277964173e-5, -3.0124814894731768e-6, 1.6324625110249603e-5, -1.8097438702973058e-5, -2.865158742490321e-5]
    f calls: 15947
 total time: 0.1724 s
stop reason: Small difference of objective function values.
+============================+


fipelle commented May 20, 2022

I have to admit that I haven't explored the full potential of the package, so there may be cases where my suggestion does not apply. I am thinking mostly about cases in which the scale of the parameters is not uniform and you may want to have a relative convergence criterion such as `median(abs(x - x_last_best) / abs(x_last_best + eps()))`.
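A small sketch of that criterion, written elementwise. This is not part of Metaheuristics.jl; `x` and `x_last_best` are hypothetical decision vectors, and the `eps()` guard is applied to the denominator as in the formula above.

```julia
using Statistics  # for median

# Median relative change between the current and last best solutions.
relative_change(x::AbstractVector, x_last_best::AbstractVector) =
    median(abs.(x .- x_last_best) ./ abs.(x_last_best .+ eps()))

# Declare convergence when the relative change falls below a tolerance.
converged(x, x_last_best; rtol = 1e-6) = relative_change(x, x_last_best) < rtol
```

Because the change is measured relative to each coordinate of `x_last_best`, the check is insensitive to non-uniform parameter scales, which is the case described above.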


jmejia8 commented May 23, 2022

I got it. Could you share some bibliographic references on those relative convergence criteria (for metaheuristics)?


fipelle commented May 23, 2022

Will do. For now, take a look at this link for a few examples of relative convergence criteria: https://www.sfu.ca/sasdoc/sashtml/iml/chap11/sect11.htm.


renesw commented Jun 14, 2022

Hi, within the performance metrics, it would be interesting to add the multiplicative unary epsilon indicator. How difficult would it be to add it?


jmejia8 commented Jun 19, 2022

@mtzrene I didn't know about that performance indicator. Could you provide more info about it? A possible integration into Metaheuristics could be carried out if a paper and/or the author's code (Julia or otherwise) is available.


renesw commented Jun 21, 2022

@jmejia8 Of course; it is a performance criterion for multi-objective optimization. I looked for code in another language to send you as a reference, but was unsuccessful. However, it is described in this article: https://ieeexplore.ieee.org/abstract/document/1197687
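For reference, a sketch of the unary multiplicative epsilon indicator as defined by Zitzler et al. in the article linked above, assuming minimization and strictly positive objective values. `A` and `R` are matrices whose rows are objective vectors (the approximation set and the reference set); neither this function nor its signature is part of Metaheuristics.jl.

```julia
# I_eps(A, R): the smallest factor by which A must be scaled so that every
# reference point in R is weakly dominated by some point of A.
function epsilon_indicator(A::AbstractMatrix, R::AbstractMatrix)
    maximum(
        minimum(
            maximum(A[i, :] ./ R[j, :]) for i in axes(A, 1)
        ) for j in axes(R, 1)
    )
end
```

A value of 1 means `A` already weakly dominates the reference set; larger values quantify how far it is from doing so.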


jmejia8 commented Jun 21, 2022

Thanks for the information, @mtzrene. I just read about the unary epsilon-indicator and it seems easy to implement. I have some questions about the multiplicative variant that we can discuss in issue #21


pnovoa commented Sep 23, 2022

Hi Mejias, great project! Congratulations! It would be nice in future versions to have a variant of "optimize" that performs the optimization process with restarts. This naive strategy could be very useful for multimodal problems and for those where we need to run a continuous optimization process (e.g. dynamic problems). It could be a new function (e.g. "optimize_with_restart") or a parameter inside "Options".

Another feature that I think would help a lot is an overloaded "copy" for State, because transferring data between instances of that type is cumbersome given the number of fields State contains.


jmejia8 commented Sep 27, 2022

@pnovoa Thank you for your comments and suggestions. Could you share with me some bibliographic references on approaches using restarts? Perhaps a restarting procedure could be implemented as a wrapper:

method = Restart(GA(), every = 20 #= generations =#)
optimize(f, bounds, method)
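Until something like the `Restart` wrapper above exists, restarts can be sketched as a plain loop that keeps the best of several independent runs. To stay self-contained, the sketch below takes a `run_once` closure (which in practice would call `optimize(f, bounds, GA())` and return `(minimum, minimizer)`); the function name and signature are hypothetical.

```julia
# Run the optimizer `restarts` times and keep the result with the smallest
# objective value. `run_once` must return a (minimum, minimizer) tuple.
function with_restarts(run_once; restarts = 5)
    results = [run_once() for _ in 1:restarts]
    results[argmin(first.(results))]
end
```

A wrapper like `Restart(GA(), every = 20)` would differ in that it restarts *within* a single budget of function evaluations, rather than running independent full-length optimizations; both designs could share this keep-the-best logic.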

Suggestions are welcome 😀

Regarding the second issue, what would you expect the overloaded copy to do? Only copy the most important attributes?
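To make the question concrete, one possible shape for such an overload, using a simplified stand-in for State (the real State in Metaheuristics.jl has many more fields): copy the mutable containers so the two instances no longer share them, while leaving scalar fields as-is.

```julia
# SimpleState is a hypothetical, reduced version of State used only to
# illustrate the overload; it is not the actual Metaheuristics.jl type.
mutable struct SimpleState
    best_sol
    population::Vector
    iteration::Int
end

# Shallow-copy scalars, but duplicate the population vector so that
# modifying the copy does not mutate the original.
Base.copy(s::SimpleState) = SimpleState(s.best_sol, copy(s.population), s.iteration)
```

The open design question is exactly the one above: whether `copy` should duplicate every field (a `deepcopy`-like semantics) or only the containers that are typically mutated between generations.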
