
Implementation of new algorithms into Metaheuristics.jl #48

Open

jbytecode opened this issue Oct 23, 2022 · 6 comments

Comments

@jbytecode
Contributor

This journal paper can be followed for implementing new algorithms:

https://ieeexplore.ieee.org/abstract/document/9344597

After implementing some crucial methods, Metaheuristics.jl could also be used in a novel survey article measuring the performance of the algorithms on a set of well-known test functions.

@jmejia8
Owner

jmejia8 commented Oct 23, 2022

That's a great idea. Let's work on it. I may start by suggesting:

@jbytecode
Contributor Author

There is a huge body of work in the literature, and I think it is not straightforward to end up with super-comprehensive comparisons involving tons of algorithms.

What is possible?

  • Filtering for emergent or relatively new algorithms (maybe filtering by year?)
  • Multiple comparisons are hard; single comparisons require a reference algorithm. To extend the idea of single comparisons to multiple ones, the reference algorithm could be selected as the long-run winner, e.g., ABC, WOA, or something else?
  • Selecting the test suite is not that problematic: a CECxx set can be selected, or a combined one can be created.

These are my first thoughts after taking a look at the references you provided.

@jmejia8
Owner

jmejia8 commented Nov 2, 2022

Regarding your last comments:

  • Filtering by year but also considering the popularity (citation rate) 🤔.
  • Regarding multiple comparisons, the Kruskal-Wallis rank sum test can be used to find statistical differences between multiple algorithms (see the sketch after this list).
  • CEC test problems combined with a set of real-world problems could be a good option.
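For instance, a minimal sketch of such a test in Julia, assuming the HypothesisTests.jl package and made-up per-algorithm result vectors (best objective values over repeated runs; all numbers are illustrative):

```julia
using HypothesisTests

# Hypothetical best objective values from 10 independent runs per algorithm.
abc = [0.012, 0.009, 0.015, 0.011, 0.010, 0.013, 0.008, 0.014, 0.012, 0.011]
woa = [0.021, 0.018, 0.025, 0.019, 0.022, 0.020, 0.024, 0.017, 0.023, 0.021]
de  = [0.010, 0.011, 0.009, 0.012, 0.008, 0.010, 0.011, 0.009, 0.013, 0.010]

# Kruskal-Wallis rank sum test across the three samples.
kw = KruskalWallisTest(abc, woa, de)
println(pvalue(kw))  # a small p-value suggests at least one algorithm differs
```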

I may suggest the following Julia packages to perform this study:

Some questions:

  • Which metaheuristics will be implemented? Maybe those for which the original authors' code is available.
  • How to save the data (format)?
  • Will the benchmarking be performed alongside this repo? Maybe creating a new repo could be a good option.
  • How to set the parameter tuning?

@jbytecode
Contributor Author

jbytecode commented Nov 2, 2022

  • Filtering by year but also considering popularity (citation rate): filtering by year and popularity is a better idea (but I am not sure how we would objectively sort two-dimensional points (year, popularity) :)
  • Maybe, and more interestingly, we could select only the new ones and reveal potential stars? How about considering only the lesser-known and newer ones?
  • The other option is to pair some oldies-but-goldies with relatively new ones, e.g., GA, DE, and PSO with Newbee1, Newbee2, and Newbee3, where each Newbee_i is a relatively new metaheuristic.
  • Multiple comparisons are okay, perhaps using the HypothesisTests.jl package?
  • HardTestProblems.jl includes the vast majority of test functions; thank you for listing this software, too!

Do we really need a project assistant for this? I suggest using a separate file for each test function. In each file we can run a single function, say Ackley, for different p (number of variables). The results can be written to a CSV file or an SQLite file as the runs proceed; Ackley.sqlite would then hold all of the information, like

p   f      algorithm   ...
5   0.002  Whale       ...
5   0.004  Whale       ...
5   0.029  Whale       ...
...

for different p (number of variables), algorithms, and algorithm configurations, so we can later generate reports using SQL statements (see the sketch below).
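A minimal sketch of this scheme, assuming the SQLite.jl and DBInterface.jl packages; the table layout, column names, and the inline Ackley definition are illustrative only:

```julia
using SQLite, DBInterface

# Standard Ackley function used as the test problem.
function ackley(x)
    p = length(x)
    20 + ℯ - 20exp(-0.2sqrt(sum(abs2, x) / p)) - exp(sum(cos, 2π .* x) / p)
end

db = SQLite.DB("Ackley.sqlite")
DBInterface.execute(db,
    "CREATE TABLE IF NOT EXISTS results (p INTEGER, f REAL, algorithm TEXT)")

# Hypothetical loop: in practice x would be the solution found by a metaheuristic.
for p in (5, 10, 25, 50), run in 1:3
    x = 0.01 .* randn(p)   # stand-in for an optimizer's result
    DBInterface.execute(db, "INSERT INTO results VALUES (?, ?, ?)",
                        (p, ackley(x), "Whale"))
end
```

Reports then reduce to SQL queries, e.g. `SELECT algorithm, AVG(f) FROM results GROUP BY algorithm`.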

  • Parameter setting is a difficult one: what should the popsize and crossover rate be for GA, or what should the DE-specific values be? Maybe the parameters suggested in the corresponding original paper can be used, or even the software package defaults.
  • Different parameter settings for different dimensionalities, e.g., for p = 5, 10, 25, and 50, different popsizes such as 50, 100, 150, and 200? (Numbers picked at random; see the sketch below.)
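A hedged sketch of that idea with Metaheuristics.jl, assuming DE's population-size keyword `N` and a simple linear scaling rule (both illustrative, not recommendations):

```julia
using Metaheuristics

sphere(x) = sum(abs2, x)   # placeholder objective

popsize(p) = 10p           # hypothetical rule: p = 5 → 50, p = 10 → 100, ...

for p in (5, 10, 25, 50)
    bounds = [fill(-5.0, 1, p); fill(5.0, 1, p)]  # 2×p matrix: lower/upper rows
    result = optimize(sphere, bounds, DE(N = popsize(p)))
    println("p = $p, best f = $(minimum(result))")
end
```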

@jmejia8
Owner

jmejia8 commented Nov 2, 2022

I agree with you. IMO, we can answer the remaining questions once we are ready to begin the benchmarking.

Let's implement new metaheuristics.

My suggestions (listed in the EC-Bestiary with available source code):

  • Battle Royale Game: Farshi TR (2020). “Battle royale optimization algorithm.” Neural Computing and Applications, 33(4), 1139-1157. doi: 10.1007/s00521-020-05004-4 [code]
  • Black Widow: Hayyolalam V, Kazem AAP (2020). “Black Widow Optimization Algorithm: A novel meta-heuristic approach for solving engineering optimization problems.” Engineering Applications of Artificial Intelligence, 87, 103249. doi: 10.1016/j.engappai.2019.103249 [code]
  • Border Collie: Dutta T, Bhattacharyya S, Dey S, Platos J (2020). “Border Collie Optimization.” IEEE Access, 8, 109177-109197. doi: 10.1109/access.2020.2999540 [code]
  • Red Fox: Połap D, Woźniak M (2021). “Red fox optimization algorithm.” Expert Systems with Applications, 166, 114107. doi: 10.1016/j.eswa.2020.114107 [code]
  • Social Behavior: Urbanization: Ghasemian H, Ghasemian F, Vahdat-Nejad H (2020). “Human urbanization algorithm: A novel metaheuristic approach.” Mathematics and Computers in Simulation, 178, 1-15. doi: 10.1016/j.matcom.2020.05.023 [code]
  • Chameleons: Braik MS (2021). “Chameleon Swarm Algorithm: A bio-inspired optimizer for solving engineering design problems.” Expert Systems with Applications, 174, 114685. doi: 10.1016/j.eswa.2021.114685 [code]

Related software: mealpy, a Python library implementing some metaheuristics.

@jbytecode
Contributor Author

I've just taken a look at the code provided for the "Battle royale optimization algorithm". What is the best option? Directly converting the Matlab code into the corresponding Julia code, or reading the papers and preparing our own implementations?

PS: I am not a good Matlab programmer, nor have I ever used it. It seems the code uses an object of a soldier data type that is not implemented elsewhere.
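Purely as an illustration of how a port might look, a hypothetical Julia struct for such a soldier agent (all field names are assumptions, not taken from the Matlab code):

```julia
# Hypothetical sketch of an agent type for a Battle Royale Optimization port.
mutable struct Soldier
    x::Vector{Float64}   # current position in the search space
    f::Float64           # objective value at x
    damage::Int          # damage counter driving the BRO update rule
end

Soldier(x, f) = Soldier(x, f, 0)   # fresh soldiers start undamaged
```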
