While doing some small migrations to ruff as part of #77, I noticed that get_loss/cost/learning_curve() are always called on a ConfigResult object, i.e. get_loss(config_result.result, ...). I will just move these to be methods directly on the config object instead of routing through the aforementioned standalone functions. This means any consumers of ConfigResult can just do config_result.loss(...). These are also hidden behind one more abstraction in the BaseOptimizer, which essentially wraps these methods with the extra parameters ... filled in.
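A minimal sketch of the proposed refactor. The result-dict layout, the `loss_value_on_error` parameter, and the `get_loss` wrapper name on `BaseOptimizer` are illustrative assumptions, not the project's actual signatures; only `ConfigResult`, `config_result.loss(...)`, and `BaseOptimizer` come from the issue text.

```python
from __future__ import annotations

from dataclasses import dataclass
from typing import Any


@dataclass
class ConfigResult:
    """Holds the raw result for one evaluated configuration."""

    result: dict[str, Any]

    def loss(self, loss_value_on_error: float | None = None) -> float:
        # Formerly the standalone get_loss(config_result.result, ...);
        # the logic now lives directly on the object.
        value = self.result.get("loss")  # hypothetical result layout
        if value is None:
            if loss_value_on_error is not None:
                return loss_value_on_error
            raise ValueError("result has no 'loss' entry")
        return float(value)


class BaseOptimizer:
    """One more abstraction: wraps the method with the optimizer's
    own default parameters filled in."""

    def __init__(self, loss_value_on_error: float | None = None) -> None:
        self.loss_value_on_error = loss_value_on_error

    def get_loss(self, config_result: ConfigResult) -> float:
        # Consumers of the optimizer never pass the extra parameters;
        # they are supplied from the optimizer's own state.
        return config_result.loss(loss_value_on_error=self.loss_value_on_error)
```

With this shape, a consumer holding a `ConfigResult` calls `config_result.loss(...)` directly, while optimizer code goes through the wrapper so the error-handling defaults stay in one place.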