Releases: sicara/easy-few-shot-learning

v1.5.0

25 Sep 08:34
51eb245

Incorporating Few-Shot Learning best practices into EasyFSL!

Feature centering and normalization

Centering and normalizing features after passing through the backbone but before inference improves performance by a couple of percentage points with almost every method and on all benchmarks, so we added this option to all extensions of the FewShotClassifier base class.

from easyfsl.methods import PrototypicalNetworks

classifier = PrototypicalNetworks(
    backbone=my_backbone,
    feature_centering=average_base_features,  # e.g. the mean feature vector of the base dataset
    feature_normalization=2,  # L2-normalize features before inference
)

Hyperparameter search

We added the tools to perform hyperparameter selection in scripts/hyperparameter_search.py. You can now launch your own grid search, or reproduce ours!
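For instance, a grid search over the softmax temperature of Finetune could look like the minimal sketch below. This is not the actual scripts/hyperparameter_search.py: task_loader, my_backbone, the grid values, and the softmax_temperature keyword are assumptions, and evaluate is assumed to be easyfsl.utils.evaluate returning an average accuracy.

from easyfsl.methods import Finetune
from easyfsl.utils import evaluate

best_accuracy, best_temperature = 0.0, None
for temperature in [0.1, 1.0, 10.0]:  # hypothetical grid
    model = Finetune(backbone=my_backbone, softmax_temperature=temperature)
    accuracy = evaluate(model, task_loader, device="cuda")
    if accuracy > best_accuracy:
        best_accuracy, best_temperature = accuracy, temperature
print(f"Best temperature: {best_temperature} ({best_accuracy:.2%} accuracy on validation tasks)")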

Improved and extended results

We ran a new benchmark on _mini_ImageNet and _tiered_ImageNet with every available method using the best hyperparameters selected on _mini_ImageNet's validation set. You now have everything you need to select the best method for your use case!

v1.4.0

06 Jun 08:42
4822039

🤯 Huge release including 4 new methods, new standardized backbones and modules, improvements to existing methods and to the datasets API, and reproducible benchmark numbers on miniImageNet and tieredImageNet!

Incredible New Features

  • Add 4 new methods: SimpleShot, FEAT, LaplacianShot, PT-MAP
  • Add new modules:
    • MultiHeadAttention: default attention module used in FEAT
    • feat_resnet12: default backbone used in FEAT
    • utility feat_resnet12_from_checkpoint to load pre-trained weights for feat_resnet12
  • Add scripts to evaluate methods:
    • predict_embeddings to extract all embeddings from a dataset using a pre-trained backbone
    • benchmark_methods to evaluate a few-shot method on a dataset of pre-extracted features
    • config files to ease reproduction
  • Add result tables to README for miniImageNet and tieredImageNet
  • Add utilities:
    • Add a torch implementation of k_nearest_neighbours()
    • Add a strip_prefix() utility that removes a specified prefix from the keys of an OrderedDict (can be used for torch state dicts); see the sketch after this list
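
As an illustration, here is a minimal sketch of loading pre-trained weights whose keys carry a leftover "backbone." prefix. It assumes strip_prefix and feat_resnet12 are importable from easyfsl.utils and easyfsl.modules respectively, that strip_prefix has the signature strip_prefix(state_dict, prefix), and that the checkpoint path is hypothetical.

import torch
from easyfsl.modules import feat_resnet12
from easyfsl.utils import strip_prefix

state_dict = torch.load("checkpoints/feat_resnet12.pth", map_location="cpu")  # hypothetical path
state_dict = strip_prefix(state_dict, "backbone.")  # drop the prefix left by a training wrapper
backbone = feat_resnet12()
backbone.load_state_dict(state_dict)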

Minor changes

  • TIM now uses cosine distance instead of L2 distance
  • Add parameterization of the softmax temperature in TransductiveFinetuning, Finetune, and TIM (see the sketch after this list)
  • Add some non-breaking changes to the datasets:
    • Add option image_file_extension to facilitate the use of the small version of the DanishFungi dataset
    • Make CUB and TieredImageNet extensions of EasySet instead of methods returning an instance of EasySet to improve typing consistency between datasets
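
A minimal construction sketch, assuming the new parameter is exposed as a softmax_temperature keyword (the exact name may differ; check the method's signature) and that my_backbone is already defined:

from easyfsl.methods import TIM

classifier = TIM(
    backbone=my_backbone,
    softmax_temperature=10.0,  # assumed keyword name; sharper softmax in the fine-tuning objective
)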

Next steps

This release is a huge step because it's the first time we commit to reproducible evaluations of the Few-Shot Learning methods in EasyFSL. We still need to improve and extend these benchmarks.

  1. Complete the benchmark with Matching Networks, Relation Networks, PT-MAP, and Transductive Finetuning (est. July 2023)
  2. Add explicit hyperparameter selection (est. July 2023)
  3. Add feature normalization, as it's been proven to have a huge impact on the results (est. July 2023)
  4. Add cross-domain benchmarks (CUB, Fungi) and benchmarks using other backbones (est. September 2023)

Any help is welcome!

v1.3.0

16 May 09:13
dca758a

New minor release with great tools to help you load all the data you want!

🥚 What's New

  • Add FeaturesDataset (#100): you can now use Few-Shot Learning models directly on embeddings!
    • FeaturesDataset can be initialized from embeddings in various forms (directly from tensors, from a dataframe or a dictionary)
    • FewShotClassifier models are now initialized with an nn.Identity backbone by default: if you don't specify a backbone, they work directly on features (see the sketch after this list)
    • New notebook to help understand how to use these new tools
  • Add WrapFewShotDataset (#101): any dataset can become a FewShotDataset and be used with EasyFSL
  • Add Python 3.11 support and remove Python 3.6 support (since that version has reached end-of-life) (#99)
  • Tensor shapes are now explicit in docstrings (#102)
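
A minimal sketch of the features-only workflow, assuming support_features, support_labels, and query_features are pre-extracted feature tensors:

from easyfsl.methods import PrototypicalNetworks

classifier = PrototypicalNetworks()  # no backbone specified: defaults to nn.Identity
classifier.process_support_set(support_features, support_labels)
scores = classifier(query_features)
predictions = scores.argmax(dim=-1)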

🗒️ Full Changelog: v1.2.1...v1.3.0

v1.2.1

30 Mar 10:22

🔧 Quickfix: make typing in TaskSampler compatible with MyPy.

v1.2.0

30 Mar 08:37
ae75fde

🎉 EasyFSL is now fully tested for Python 3.10!

There were also a lot of issues caused by TaskSampler being too permissive, which led to downstream errors that were hard to understand. TaskSampler now raises an explicit error (a typical setup is sketched after the list below) if you:

  • initialize it with a dataset that's too small (number of classes smaller than n_way, or number of instances in a class smaller than n_shot + n_query)
  • call TaskSampler.episodic_collate_fn() with input data that is not a (Tensor, int) or (Tensor, Tensor[0-dim int]) tuple.
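
A typical setup, assuming my_dataset is a FewShotDataset (for instance an EasySet) with at least 5 classes and at least 15 images per class:

from torch.utils.data import DataLoader
from easyfsl.samplers import TaskSampler

sampler = TaskSampler(my_dataset, n_way=5, n_shot=5, n_query=10, n_tasks=100)
loader = DataLoader(
    my_dataset,
    batch_sampler=sampler,
    collate_fn=sampler.episodic_collate_fn,  # dataset items must be (Tensor, int) tuples
)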

Thanks to @Aml-Hassan-Abd-El-hamid for their goodwill and contribution.

v1.1.0

05 Sep 12:24
5baddc3

You can now create a support set from a file structure and easily feed it to Few-Shot Learning methods with SupportSetFolder. Thanks @diego91964!
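
A minimal sketch, assuming SupportSetFolder lives in easyfsl.datasets, expects one sub-directory per class (ImageFolder-style), and exposes get_images() and get_labels() accessors; the folder path is hypothetical.

from easyfsl.datasets import SupportSetFolder
from easyfsl.methods import PrototypicalNetworks

support_set = SupportSetFolder(root="data/my_support_set")  # one sub-directory per class
classifier = PrototypicalNetworks(backbone=my_backbone).eval()
classifier.process_support_set(support_set.get_images(), support_set.get_labels())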

v1.0.1

07 Jun 17:50
278f0a6
Compare
Choose a tag to compare

There were some things to fix after the v1 release, so we fixed them:

  • EasySet's format check is now case-insensitive (thanks @mgmalana 😄 )
  • TaskSampler used to yield torch.Tensor objects, which caused errors. It now yields lists of integers, as is standard in PyTorch's interface.
  • When EasySet's initialization didn't find any images in the specified folders, it silently built an empty dataset, which caused hard-to-trace errors downstream. EasySet.__init__() now issues the following warning if no image is found: "No images found in the specified directories. The dataset will be empty"

v1.0.0

21 Mar 14:11
6e27907

🎂 Exactly one year after the first release of EasyFSL, we have one more year of experience in Few-Shot Learning research. We capitalize on this experience to make EasyFSL easier, cleaner, and smarter.

No more episodic training logic inside Few-Shot Learning methods: you can train them however you want (see the sketch below).
And more content! 4 additional methods; several ResNet architectures commonly used in FSL research; and 4 ready-to-use datasets.
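
A minimal sketch of classical (non-episodic) training, where the backbone is trained with plain cross-entropy on base classes and then handed to a few-shot method; my_backbone, train_loader, FEATURE_DIMENSION, and NUMBER_OF_BASE_CLASSES are assumptions:

import torch
from torch import nn
from easyfsl.methods import PrototypicalNetworks

model = nn.Sequential(my_backbone, nn.Linear(FEATURE_DIMENSION, NUMBER_OF_BASE_CLASSES))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

for images, labels in train_loader:  # a standard classification loop, no episodes
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()

few_shot_classifier = PrototypicalNetworks(backbone=my_backbone).eval()  # ready for few-shot evaluation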

🗞️ What's New

  • Few-Shot Learning methods
  • Pre-designed ResNet architectures for Few-Shot Learning
  • Most common few-shot classification datasets
    • _tiered_ImageNet
    • _mini_ImageNet
    • CU-Birds
    • Danish Fungi (not common but new, and really great)
    • And also an abstract class FewShotDataset to ease your development of novel or modified datasets
  • Example notebooks to perform both episodic training and classical training for your Few-Shot Learning methods
  • Support for Python 3.9

🔩 What's Changed

  • AbstractMetaLearner is renamed FewShotClassifier. All the episodic training logic has been removed from this class and moved to the example notebook episodic_training.ipynb
  • FewShotClassifier now supports non-cuda devices
  • FewShotClassifier can now be initialized with a backbone on GPU
  • Relation module in RelationNetworks can now be parameterized
  • Same for embedding modules in Matching Networks
  • Same for image preprocessing in pre-designed datasets like EasySet
  • EasySet now only collects image files

Full Changelog: v0.2.2...v1.0.0

v0.2.2

09 Nov 15:11
13221f9

Small fixes in EasySet and AbstractMetaLearner

  • Sort data instances for each class in EasySet
  • Add EasySet.number_of_classes()
  • Fix best validation accuracy update
  • Move switch to train mode inside fit_on_task()
  • Make AbstractMetaLearner.fit() return average loss

v0.2.1

22 Jun 08:32
afb3155

We fixed a bug that caused validation to not occur during training.