Performance of Neural Tangent Kernel (NTK) on UCI datasets

This is the code for the UCI experiments in the paper "Harnessing the Power of Infinitely Wide Deep Nets on Small-data Tasks".
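The sketch below illustrates the kind of computation involved: the standard NTK recursion for an infinitely wide, fully-connected ReLU network, with the resulting Gram matrix fed to a kernel SVM. It is a minimal illustration only; the function names, depth parameter, and use of sklearn's SVC are assumptions and may differ from what UCI.py actually does.

```python
import numpy as np
from sklearn.svm import SVC

def ntk_gram(X, depth):
    """NTK Gram matrix for a ReLU net with `depth` hidden layers.

    X is assumed to be row-normalized to unit L2 norm, so the diagonal of
    every covariance matrix in the recursion stays equal to 1.
    """
    S = X @ X.T            # Sigma^(0): input covariance
    H = S.copy()           # Theta^(0): the NTK starts at the input covariance
    for _ in range(depth):
        # Angle between the pre-activations of each pair of inputs.
        theta = np.arccos(np.clip(S, -1.0, 1.0))
        # Arc-cosine kernel updates for ReLU (with the usual factor-2 scaling).
        S_dot = (np.pi - theta) / np.pi
        S = (np.sin(theta) + (np.pi - theta) * np.cos(theta)) / np.pi
        # NTK recursion: Theta^(h) = Theta^(h-1) * Sigma_dot^(h) + Sigma^(h)
        H = H * S_dot + S
    return H

# Illustrative usage on random data standing in for a UCI dataset.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))
X /= np.linalg.norm(X, axis=1, keepdims=True)   # unit-normalize rows
y = (X[:, 0] > 0).astype(int)

K = ntk_gram(X, depth=3)
clf = SVC(kernel="precomputed", C=1.0).fit(K, y)
print("train accuracy:", clf.score(K, y))
```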

Prerequisites

Python 3, numpy, scikit-learn (sklearn)

Setup

Download and decompress the pre-processed datasets used in the paper "Do we need hundreds of classifiers to solve real world classification problems?" by running:

bash setup.sh

Running the tests

python UCI.py -max_tot N -max_dep dep -file output_file

Use option -max_tot N to skip datasets with size larger than N.

Use option -max_dep dep to set the maximum depth allowed for NTK.

Use option -file output_file to set the output file. An example invocation is shown below.
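For example, to skip datasets with more than 5000 examples, allow NTK depth up to 5, and write results to ntk_results.txt (the values here are illustrative, not the settings used in the paper):

python UCI.py -max_tot 5000 -max_dep 5 -file ntk_results.txt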

Comparison

Comparisons with other classifiers use the results reported in "Do we need hundreds of classifiers to solve real world classification problems?", available from the link below:

Details are given in the paper "Harnessing the Power of Infinitely Wide Deep Nets on Small-data Tasks".