[Refactoring] get_loss/cost/learning_curve() #78

Open
eddiebergman opened this issue Apr 30, 2024 · 0 comments
@eddiebergman (Contributor) commented:
While doing some small migrations to ruff as part of #77, I noticed that get_loss/cost/learning_curve() are always called on a ConfigResult object, i.e. get_loss(config_result.result, ...). I will move these to be methods directly on ConfigResult instead of routing through the aforementioned standalone functions, so any consumer of ConfigResult can simply call config_result.loss(...). These functions are also hidden behind one more layer of abstraction in BaseOptimizer, which essentially wraps them with the extra parameters ... filled in.
