Releases: keras-team/keras-tuner

Release v1.1.2

25 Mar 05:32

What's Changed

  • add --profile=black to isort by @LukeWood in #672
  • In model checkpointing callback, check the logs before getting the objective value by @haifeng-jin in #674
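For reference, the isort change above corresponds to configuration along these lines; the exact file the repo uses is not shown here, so treat this as a sketch:

```toml
# Make isort's import ordering compatible with black's formatting.
[tool.isort]
profile = "black"
```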

Full Changelog: 1.1.1...1.1.2

Release v1.1.2RC0

25 Mar 00:21
Pre-release

What's Changed

  • add --profile=black to isort by @LukeWood in #672
  • In model checkpointing callback, check the logs before getting the objective value by @haifeng-jin in #674

Full Changelog: 1.1.1...1.1.2rc0

Release v1.1.1

20 Mar 04:32

Highlights

  • Support passing a list of objectives as the objective argument.
  • Raise a better error message when the return value of run_trial() or HyperModel.fit() is of the wrong type.
  • Various bug fixes for BayesianOptimization tuner.
  • The trial IDs are changed from hex strings to integers counting from 0.

What's Changed

  • Make hyperparameters names visible in Display output by @C-Pro in #634
  • Replace import kerastuner with import keras_tuner by @ageron in #640
  • Support multi-objective by @haifeng-jin in #641
  • reorganize the tests to follow keras best practices by @haifeng-jin in #643
  • keep Objective in oracle for backward compatibility by @haifeng-jin in #644
  • better error check for returned eval results by @haifeng-jin in #646
  • Mitigate the issue of hanging workers after chief already quits when running keras-tuner in distributed tuning mode. by @mtian29 in #645
  • Ensure hallucination checks if the Gaussian regressor has been fit be… by @brydon in #650
  • Resolves #609: Support for sklearn functions without sample_weight by @brydon in #651
  • Resolves #652 and #605: Make human readable trial_id and sync trial numbers between worker Displays by @brydon in #653
  • Update tuner.py by @haifeng-jin in #657
  • fix(bayesian): scalar optimization result (#655) by @haifeng-jin in #662
  • Generalize hallucination checks to avoid racing conditions by @alisterl in #664
  • remove scipy from required dependency by @haifeng-jin in #665
  • Import scipy.optimize by @haifeng-jin in #667

Full Changelog: 1.1.1rc0...1.1.1

Release v1.1.1RC0

01 Mar 06:39
Pre-release

Highlights

  • Support passing a list of objectives as the objective argument.
  • Raise a better error message when the return value of run_trial() or HyperModel.fit() is of the wrong type.

What's Changed

  • Make hyperparameters names visible in Display output by @C-Pro in #634
  • Replace import kerastuner with import keras_tuner by @ageron in #640
  • Support multi-objective by @haifeng-jin in #641
  • reorganize the tests to follow keras best practices by @haifeng-jin in #643
  • keep Objective in oracle for backward compatibility by @haifeng-jin in #644
  • better error check for returned eval results by @haifeng-jin in #646
  • Mitigate the issue of hanging workers after chief already quits when running keras-tuner in distributed tuning mode. by @mtian29 in #645
  • Ensure hallucination checks if the Gaussian regressor has been fit be… by @brydon in #650
  • Resolves #609: Support for sklearn functions without sample_weight by @brydon in #651
  • Resolves #652 and #605: Make human readable trial_id and sync trial numbers between worker Displays by @brydon in #653
  • Update tuner.py by @haifeng-jin in #657

Full Changelog: 1.1.0...1.1.1rc0

Release v1.1.0

05 Nov 17:13

What's Changed

  • Support HyperModel.fit() to tune the fit process.
  • Support Tuner.run_trial() to return a single float as the objective value to minimize.
  • Support Tuner.run_trial() to return a dictionary of {metric_name: value} or Keras history.
  • Allow not providing hypermodel to Tuner when Tuner.run_trial() is overridden.
  • Allow not providing objective to Tuner when HyperModel.fit() or Tuner.run_trial() returns a single float.
  • Bug fixes

Breaking Changes

  • Merged the internal class MultiExecutionTuner into Tuner, replacing all of its overridden methods.
  • Removed KerasHyperModel, an internal class that wrapped the user-provided HyperModel.

Full Changelog: 1.0.4...1.1.0

Release v1.1.0rc0

19 Oct 23:10
Pre-release

What's Changed

  • Support HyperModel.fit() to tune the fit process.
  • Support Tuner.run_trial() to return a single float as the objective value to minimize.
  • Support Tuner.run_trial() to return a dictionary of {metric_name: value} or Keras history.
  • Allow not providing hypermodel to Tuner when Tuner.run_trial() is overridden.
  • Allow not providing objective to Tuner when HyperModel.fit() or Tuner.run_trial() returns a single float.
  • Bug fixes

Breaking Changes

  • Merged the internal class MultiExecutionTuner into Tuner, replacing all of its overridden methods.
  • Removed KerasHyperModel, an internal class that wrapped the user-provided HyperModel.

Full Changelog: 1.0.4...1.1.0rc0

Release v1.0.4

25 Aug 22:03
baa0534
  • Support DataFrame in SklearnTuner.
  • Support Tuner.search_space_summary() to print all the hyperparameters based on conditional_scopes.
  • Support TensorFlow 2.0 for backward compatibility.
  • Bug fixes and documentation improvements.
  • Raise a warning when using with TF 1.
  • Save TPUStrategy models with the TF format.

Release v1.0.4rc1

24 Aug 06:05
Pre-release
  • Support DataFrame in SklearnTuner.
  • Support Tuner.search_space_summary() to print all the hyperparameters based on conditional_scopes.
  • Support TensorFlow 2.0 for backward compatibility.
  • Bug fixes and documentation improvements.
  • Raise a warning when using with TF 1.

Release v1.0.4rc0

15 Aug 05:39
Pre-release
  • Support DataFrame in SklearnTuner.
  • Support Tuner.search_space_summary() to print all the hyperparameters based on conditional_scopes.
  • Support TensorFlow 2.0 for backward compatibility.
  • Bug fixes and documentation improvements.

Release v1.0.3

17 Jun 21:29
e07ab22
  • Renamed import name of kerastuner to keras_tuner.
  • Renamed the Oracle classes to add Oracle as a suffix, e.g., the RandomSearch oracle is renamed to RandomSearchOracle. (The RandomSearch tuner is still named RandomSearch.)
  • Renamed Tuner._populate_space to Tuner.populate_space.
  • Renamed Tuner._score_trial to Tuner.score_trial.
  • Renamed kt.tuners.Sklearn tuner to kt.SklearnTuner and put it at the root level of import.
  • Removed the CloudLogger feature, but the Logger class still works.
  • Support tuning sklearn.pipeline.Pipeline.
  • Improved the docstrings.