
Learning the kernel hyperparameters continuously

This algorithm learns convex combinations of continuously parameterized kernels. For example, it can select a kernel from the family of Gaussian kernels whose bandwidth parameter lies in a given interval.
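
As a concrete illustration, here is a minimal MATLAB sketch of evaluating such a convex combination at two points. The bandwidths and weights below are made-up values; the algorithm in this repository learns them from the training data.

```matlab
% Convex combination of Gaussian kernels K_sigma(x, y) = exp(-||x - y||^2 / (2*sigma^2)).
% The bandwidths and weights here are hypothetical, purely for illustration.
sigmas  = [0.5, 1.0, 2.0];   % candidate bandwidths inside the interval [0.5, 2]
lambdas = [0.2, 0.5, 0.3];   % convex weights: nonnegative and summing to one

x = [1; 2];                  % two example points in R^2
y = [0; 1];

K = 0;
for j = 1:numel(sigmas)
    K = K + lambdas(j) * exp(-norm(x - y)^2 / (2 * sigmas(j)^2));
end
fprintf('Combined kernel value: %g\n', K);
```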

Like multiple kernel learning, this approach is an alternative to standard hyperparameter tuning by (cross-)validation: it requires only a training set and no validation set. In this setting, tuning the kernel parameters is viewed as learning a convex combination drawn from an infinite, continuously parameterized family of kernels. The algorithm is detailed in Learning Convex Combinations of Continuously Parameterized Basic Kernels and A DC-Programming Algorithm for Kernel Selection.
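
Schematically, and with notation assumed here rather than quoted from the papers, the kernel and the predictor are learned jointly by solving a problem of the form

$$\min_{K \in \operatorname{conv}\{K_\sigma : \sigma \in [a,b]\}} \; \min_{f \in \mathcal{H}_K} \; \sum_{i=1}^m \ell(y_i, f(x_i)) + \mu \, \|f\|_{\mathcal{H}_K}^2,$$

where $K_\sigma$ is a basic kernel (e.g. a Gaussian kernel with bandwidth $\sigma$), $\mathcal{H}_K$ is the reproducing kernel Hilbert space of $K$, $\ell$ is a loss function, and $\mu > 0$ is a regularization parameter.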

The resulting optimization problem is not convex in general. However, in certain cases of interest (such as Gaussian kernels), a DC¹ decomposition of the problem can be readily obtained. The implemented algorithm comes from the field of DC optimization and is tractable for a small number of kernel parameters.
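
To illustrate the general idea behind DC optimization, here is a generic DCA (difference-of-convex algorithm) sketch on a one-dimensional toy objective; it is not the kernel-selection solver from this repository. To minimize f = g − h with g and h convex, DCA repeatedly linearizes h at the current iterate and minimizes the resulting convex surrogate.

```matlab
% Generic DCA sketch on a toy problem (illustrative only): minimize
% f(t) = g(t) - h(t) with g(t) = t^4 and h(t) = 2*t^2, both convex.
% Stationary points of f are t = 0 and t = +-1; the minima are at t = +-1.
hgrad = @(t) 4*t;                       % gradient of h(t) = 2*t^2
t = 3;                                  % initial iterate
for k = 1:50
    s = hgrad(t);                       % linearize h at the current iterate
    % Convex surrogate: minimize t^4 - s*t, solved in closed form by 4*t^3 = s.
    t_new = sign(s) * (abs(s) / 4)^(1/3);
    if abs(t_new - t) < 1e-10
        t = t_new;
        break;
    end
    t = t_new;
end
fprintf('DCA converged to t = %g\n', t);  % converges to t = 1 from t = 3
```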

To reproduce the experiments in A DC-Programming Algorithm for Kernel Selection, execute the digit_runs*.m scripts (after first downloading and preprocessing the MNIST data into data/ as described in the paper), for example as sketched below.
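
A minimal sketch, assuming the repository root is the current MATLAB directory and the MNIST data is already in place under data/:

```matlab
% Run every experiment script matching digit_runs*.m in sequence.
scripts = dir('digit_runs*.m');
for k = 1:numel(scripts)
    [~, name] = fileparts(scripts(k).name);
    fprintf('Running %s ...\n', name);
    run(name);                % execute the experiment script
end
```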

¹ Difference of convex functions.
