Opportunity to reduce redundant evaluation #442

Open
zkurtz opened this issue Sep 18, 2018 · 2 comments

Comments


zkurtz commented Sep 18, 2018

In a simple example of a mixed-space optimization with mostly default settings and a deterministic objective, mbo repeatedly evaluates the same point:

library(mlrMBO)
library(data.table)

fun = function(x) {
  if(x$method == "a") return(x$number)
  return(1)
}

space = makeParamSet(
  makeDiscreteParam("method", values = c("a", "b")),
  makeNumericParam("number", lower = 0, upper = 2 * pi, requires = quote(method == "a"))
)

obj = makeSingleObjectiveFunction(
  name = "mixed_example",
  fn = fun,
  par.set = space,
  has.simple.signature = FALSE,
  noisy = FALSE,
  minimize = TRUE
)

run = mbo(obj, control = makeMBOControl(), show.info = FALSE)

## ... many warnings of the form:
#           Warning in generateDesign(control$infill.opt.focussearch.points, ps.local,  :
#             generateDesign could only produce 1 points instead of 1000!

DT = as.data.table(run$opt.path)
head(DT[is.na(number), c("method", "number", "y")])
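
# A quick tally of how often each (method, number) pair was evaluated
# (a sketch; it reuses the DT built above):
DT[, .N, by = .(method, number)][N > 1]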

The result shows that a single point was evaluated repeatedly, even though I set noisy = FALSE in the objective. Such repeated evaluation is costly in other settings -- is there any reason to allow it?


jakob-r commented Sep 18, 2018

This is indeed suboptimal (also because mbo never jumps to method = "a"), but it is also a very special case.
We have the setting ctrl = setMBOControlInfill(ctrl, filter.proposed.points = TRUE), which you can activate, but apparently we have not implemented it for discrete parameters.
It would generate a random proposal whenever the proposed point equals an already evaluated point.
I think we should implement this for discrete parameters as an easy remedy.
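
For reference, this is roughly how the filter would be switched on once it covers discrete parameters (a minimal sketch reusing the obj from the example above; untested):

ctrl = makeMBOControl()
ctrl = setMBOControlInfill(ctrl, filter.proposed.points = TRUE)
run = mbo(obj, control = ctrl, show.info = FALSE)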

Otherwise you could tweak the surrogate and the infill criterion settings. For a small number of discrete values, dummy encoding combined with kriging is often a good choice:

# Kriging surrogate with standard-error predictions for the infill criterion
lrn = makeLearner("regr.km", covtype = "matern3_2", optim.method = "gen", control = list(trace = FALSE), predict.type = "se")
# Dummy-encode the discrete parameter
lrn = makeDummyFeaturesWrapper(lrn)
# Impute values of parameters that are inactive due to the "requires" condition
lrn = makeImputeWrapper(lrn, classes = list(numeric = imputeMax(10), factor = imputeConstant("<missing>")))
# Drop features that end up constant after encoding/imputation
lrn = makeRemoveConstantFeaturesWrapper(lrn)

Note that this will have problems with your given function because of the constant outcome for method = "b".
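
To try it, the wrapped learner would then be passed to mbo() via its learner argument (a sketch, reusing the obj and the filtering ctrl sketched above):

run = mbo(obj, learner = lrn, control = ctrl, show.info = FALSE)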


jakob-r commented Sep 18, 2018

Working on it in #444
