Travis CI Build Status Codeship Build Status CircleCI Build Status Semaphore Build Status Snap CI Build Status

Please visit our website to download the latest release in the form of a stand-alone executable. A set of slides with examples and descriptions of how to use the system can be found here.

This is the main source repository of the optimizationBenchmarking.org tool suite. The optimizationBenchmarking.org tool suite supports researchers in evaluating and comparing the performance of (anytime) optimization algorithms, such as [Local Search](http://en.wikipedia.org/wiki/Local_search_%28optimization%29), Evolutionary Algorithms, Swarm Intelligence methods, Branch and Bound, and virtually all other metaheuristics.

System Requirements

  1. Java 1.7: ideally a JDK, because under a plain JRE the software runs slower and needs more memory
  2. optional: a LaTeX installation such as MikTeX or TeX Live
  3. the third-party libraries that optimizationBenchmarking depends on, but only if you do not use the stand-alone/full executable

Optimization and Anytime Algorithms

Optimization algorithms are algorithms which can find (approximate) solutions for computationally hard (e.g., NP-hard) problems, such as the Traveling Salesman Problem, the Maximum Satisfiability Problem, or the Bin Packing Problem. For such problems, solvers cannot guarantee to always find the globally best possible solution within feasible time. In order to solve these problems, solution quality has to be traded for shorter runtime.

Anytime optimization algorithms do this by starting with a more or less random (and hence usually bad) approximation of the solution and improving this approximation over the course of the run. Comparing two such algorithms is not easy, since it involves comparing behavior over runtime.
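The anytime behavior described above can be sketched in a few lines of Java. The following is a minimal, hypothetical example (not part of the optimizationBenchmarking API): a random-mutation hill climber on a toy bit-string problem that starts from a random (usually bad) solution and prints every improvement as a (consumed function evaluations, best quality) pair, i.e., exactly the kind of progress data such comparisons are based on.

```java
import java.util.Random;

/**
 * Hypothetical sketch of an anytime optimization algorithm: a
 * random-mutation hill climber minimizing the number of zero bits
 * in a bit string. Every strict improvement is logged as a
 * (consumed-FEs, best-quality) row.
 */
public class AnytimeHillClimber {

  /** quality = number of zero bits; 0 is optimal (we minimize) */
  static int quality(boolean[] x) {
    int zeros = 0;
    for (boolean b : x) {
      if (!b) zeros++;
    }
    return zeros;
  }

  public static void main(String[] args) {
    Random rnd = new Random(42L);
    boolean[] best = new boolean[32];
    for (int i = 0; i < best.length; i++) {
      best[i] = rnd.nextBoolean(); // random initial solution
    }
    int bestQ = quality(best);

    for (int fes = 1; fes <= 10000; fes++) {
      boolean[] cand = best.clone();
      cand[rnd.nextInt(cand.length)] ^= true; // flip one random bit
      int q = quality(cand);
      if (q <= bestQ) {            // accept equal-or-better candidates
        best = cand;
        if (q < bestQ) {           // log only strict improvements
          System.out.println(fes + "\t" + q);
        }
        bestQ = q;
      }
      if (bestQ == 0) break;       // optimum reached, stop early
    }
  }
}
```

The quality of the best solution can only improve over time, so the logged rows form a monotonically decreasing curve; comparing two such curves is what the tool suite automates.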

In this project, we try to provide a set of tools to make this process easier. The currently available tool can load log files (with rows of the form, e.g., consumed-runtime, best-solution-found) and render performance reports in a variety of different formats, including LaTeX and XHTML. These reports contain performance metrics and comparisons carried out according to a user-provided specification.
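A log file in the row format described above might look like the following (a hypothetical example for illustration; the actual column meanings and separators depend on your measurement setup and the user-provided specification). Each row records the runtime consumed so far and the quality of the best solution found up to that point:

```
# consumed-runtime-ms    best-solution-quality
50        0.8734
1200      0.5120
35000     0.2201
980000    0.2199
```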

Examples

We provide a set of examples which can be executed directly on the command line of your Linux or Windows machine, provided that you have Java 1.7 installed (and potentially svn and a LaTeX installation). No further installation or downloads are required, and nothing will be installed (files are just copied into the current folder).

  1. Comparison of Algorithms for the Maximum Satisfiability Problem: [Linux] [Windows]
  2. Comparison of Some Algorithms from BBOB'2013: [Linux] [Windows]
  3. Comparison of Some Algorithms for the TSP: [Linux] [Windows]
