leowyy/autotune

# AutoTune

How can we optimize the hyperparameters of a machine learning model automatically and efficiently?

Tuning a conv net on CIFAR10

## Defining a problem instance

1. Specify your hyperparameter domain in `self.initialise_domain()`, e.g.

```python
def initialise_domain(self):
    params = {
        'learning_rate': Param('learning_rate', -6, 0, distrib='uniform', scale='log', logbase=10),
        'weight_decay':  Param('weight_decay', -6, -1, distrib='uniform', scale='log', logbase=10),
        'momentum':      Param('momentum', 0.3, 0.9, distrib='uniform', scale='linear'),
        'batch_size':    Param('batch_size', 20, 2000, distrib='uniform', scale='linear', interval=1),
    }
    return params
```
2. Specify your objective function in `self.eval_arm(params, n_resources)`, e.g.

```python
def eval_arm(self, params, n_resources):
    model = generate_net(params)
    model.train(n_iter=n_resources)
    acc = model.test()
    return 1 - acc  # minimise the test error
```
3. Test your problem with the following code snippet:

```python
problem = MyProblem()
params = problem.generate_arms(1)       # draws a sample from the hyperparameter space
f_val = problem.eval_arm(params[0], 1)  # evaluates the sampled hyperparameters
```
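The `Param` class and the `generate_arms` method are provided by this repository; their exact implementations are not shown above. The following self-contained sketch shows how the pieces could plausibly fit together, under stated assumptions: a hypothetical `Param.sample()` that draws uniformly on either a linear or a log scale, and a toy objective (minimising `(lr - 0.01)**2`) standing in for actually training a network.

```python
import random

class Param:
    """Hypothetical sketch of Param: samples uniformly on the chosen scale.

    With scale='log', the bounds are treated as exponents of logbase, so
    Param('lr', -6, 0, scale='log', logbase=10) draws values in [1e-6, 1].
    An 'interval' rounds the draw to the nearest multiple (e.g. integers).
    """
    def __init__(self, name, lo, hi, distrib='uniform', scale='linear',
                 logbase=None, interval=None):
        self.name, self.lo, self.hi = name, lo, hi
        self.scale, self.logbase, self.interval = scale, logbase, interval

    def sample(self):
        x = random.uniform(self.lo, self.hi)
        if self.scale == 'log':
            x = self.logbase ** x
        if self.interval is not None:
            x = int(round(x / self.interval) * self.interval)
        return x

class ToyProblem:
    """Toy stand-in for MyProblem: the 'objective' just scores the sampled
    learning rate against 0.01 instead of training a model."""
    def initialise_domain(self):
        return {'learning_rate': Param('learning_rate', -6, 0,
                                       distrib='uniform', scale='log', logbase=10)}

    def generate_arms(self, n):
        domain = self.initialise_domain()
        return [{name: p.sample() for name, p in domain.items()} for _ in range(n)]

    def eval_arm(self, params, n_resources):
        # A real problem would train for n_resources iterations here.
        return (params['learning_rate'] - 0.01) ** 2

problem = ToyProblem()
arms = problem.generate_arms(1)
f_val = problem.eval_arm(arms[0], 1)
```

Sampling exponents rather than raw values is the standard trick for parameters like learning rates that vary over several orders of magnitude.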

## References

- Li et al. (2016), Hyperband: A Novel Bandit-Based Approach to Hyperparameter Optimization
- Bergstra et al. (2011), Algorithms for Hyper-Parameter Optimization
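The Hyperband paper cited above is built around successive halving: start many configurations on a small resource budget, and repeatedly keep only the best fraction while multiplying the budget. A minimal sketch of one such bracket, using plain callables and a toy objective rather than this repository's classes (all names here are illustrative):

```python
import random

def successive_halving(sample_arm, eval_arm, n_arms=9, r_init=1, eta=3):
    """One bracket of successive halving, the core subroutine of Hyperband.

    Evaluates all surviving arms on budget r, keeps the best 1/eta of them,
    multiplies the budget by eta, and repeats until one arm remains.
    """
    arms = [sample_arm() for _ in range(n_arms)]
    r = r_init
    while len(arms) > 1:
        losses = [eval_arm(arm, r) for arm in arms]
        order = sorted(range(len(arms)), key=losses.__getitem__)
        arms = [arms[i] for i in order[:max(1, len(arms) // eta)]]
        r *= eta
    return arms[0]

# Toy usage: minimise (x - 0.3)^2; this toy objective ignores the budget r.
best = successive_halving(
    sample_arm=lambda: random.uniform(0, 1),
    eval_arm=lambda x, r: (x - 0.3) ** 2,
)
```

Full Hyperband additionally loops over several brackets with different trade-offs between the number of arms and the initial budget, hedging against objectives where cheap early evaluations are misleading.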

Stay tuned for more updates...
