
Releases: Optimization-AI/LibAUC

LibAUC 1.3.0

11 Jun 22:53
64dfc01

Introducing LibAUC 1.3.0

We are thrilled to release LibAUC 1.3.0! This version brings improvements and new features across the library. We have launched a new documentation website at https://docs.libauc.org/, where you can browse the source code and its documentation. We are also happy to announce that our LibAUC paper has been accepted to KDD 2023!

Major Improvements

  • Improved the implementations for DualSampler and TriSampler for better efficiency.
  • Merged the DataSampler for NDCGLoss into TriSampler and added a new string argument mode to switch between classification mode (for multi-label classification) and ranking mode (for movie recommendation).
  • Improved AUCMLoss and added a new version, v2 (requires DualSampler), which removes the class prior p required by the previous v1. To switch versions, set version='v1' or version='v2' in AUCMLoss.
  • Improved CompositionalAUCLoss, which now allows multiple updates for optimizing the inner loss by setting k in the loss. As with AUCMLoss, we introduced a v2 version that does not require the class prior p. By default, k is 1 and version is 'v1'.
  • Improved code quality for APLoss and pAUCLoss including pAUC_CVaR_Loss, pAUC_DRO_Loss, tpAUC_KL_Loss for better efficiency and readability.
  • API change for all optimizers: pass model.parameters() to the optimizer instead of model, e.g., PESG(model.parameters()).
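
The v1/v2 distinction above comes down to whether the class prior p must be supplied by the user. As a rough illustration (a simplified sketch, not LibAUC's implementation), the empirical AUC-margin objective behind AUCMLoss can be evaluated in closed form on a full batch, with p estimated from the labels (as AUCMLoss has done automatically since v1.2.0):

```python
# Simplified sketch of the AUC-margin objective behind AUCMLoss,
# evaluated in closed form on a full batch with a, b, alpha at their
# optima. Illustration only; LibAUC solves the min-max problem
# stochastically with PESG, and its scaling/details may differ.

def estimate_prior(labels):
    """Estimate the class prior p = #positives / #samples from labels."""
    return sum(labels) / len(labels)

def aucm_objective_v1(scores, labels, p, margin=1.0):
    """Empirical AUC-margin value; p is the class prior v1 requires."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    mean_pos = sum(pos) / len(pos)
    mean_neg = sum(neg) / len(neg)
    # With a = mean_pos and b = mean_neg, the squared terms are variances.
    var_pos = sum((s - mean_pos) ** 2 for s in pos) / len(pos)
    var_neg = sum((s - mean_neg) ** 2 for s in neg) / len(neg)
    # Inner maximization over alpha >= 0 has a closed-form solution.
    alpha = max(0.0, margin + mean_neg - mean_pos)
    return p * (1 - p) * (var_pos + var_neg + alpha ** 2)

scores = [0.9, 0.8, 0.3, 0.1]
labels = [1, 1, 0, 0]
p = estimate_prior(labels)  # what v2 lets you omit
loss = aucm_objective_v1(scores, labels, p, margin=1.0)
```

The loss vanishes only when both classes' scores are concentrated and positives beat negatives by at least the margin, which is what drives AUC maximization.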

New Features

  • Launched an official documentation site at http://docs.libauc.org/ to access source code and parameter information.
  • Introduced a new library logo for X-Risk, designed by Zhuoning Yuan and Tianbao Yang.
  • Introduced MIDAM for multi-instance learning. It supports two pooling functions: MIDAMLoss('softmax') for softmax pooling and MIDAMLoss('attention') for attention-based pooling.
  • Introduced a new GCLoss wrapper for contrastive self-supervised learning, which can be optimized by two algorithms in the backend: SogCLR and iSogCLR.
  • Introduced iSogCLR for automatic temperature individualization in self-supervised contrastive learning. To use iSogCLR, set GCLoss('unimodal', enable_isogclr=True) or GCLoss('bimodal', enable_isogclr=True).
  • Introduced three new multi-label losses: mAPLoss for optimizing mean AP, MultiLabelAUCMLoss for optimizing multi-label AUC loss, and MultiLabelpAUCLoss for multi-label partial AUC loss.
  • Introduced PairwiseAUCLoss to support optimization of traditional pairwise AUC losses.
  • Added more evaluation metrics: ndcg_at_k, map_at_k, precision_at_k, and recall_at_k.
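
For intuition on the new ranking metrics, here is a pure-Python sketch of what ndcg_at_k and precision_at_k compute; LibAUC's own implementations are vectorized and may differ in details such as the gain function or tie handling:

```python
import math

# Hedged sketch of two of the ranking metrics named above; the input
# `relevances` lists item relevances in the order the model ranked them.

def precision_at_k(relevances, k):
    """Fraction of the top-k ranked items that are relevant (binary)."""
    return sum(1 for r in relevances[:k] if r > 0) / k

def dcg_at_k(relevances, k):
    """Discounted cumulative gain: relevance discounted by log2 of rank."""
    return sum(r / math.log2(i + 2) for i, r in enumerate(relevances[:k]))

def ndcg_at_k(relevances, k):
    """DCG of the predicted ranking normalized by the ideal DCG."""
    ideal = dcg_at_k(sorted(relevances, reverse=True), k)
    return dcg_at_k(relevances, k) / ideal if ideal > 0 else 0.0
```

A perfectly ordered list scores NDCG 1.0; swapping relevant items below irrelevant ones pushes it toward 0.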

Acknowledgment

Team: Zhuoning Yuan, Dixian Zhu, Zi-Hao Qiu, Gang Li, Tianbao Yang (Advisor)

Feedback

We value your thoughts and feedback! Please fill out this brief survey to guide our future development. Thank you for your time! For other questions, please contact Zhuoning Yuan (yzhuoning@gmail.com) or Tianbao Yang (tianbao-yang@tamu.edu).

v1.2.0

31 Jul 03:36
12ba07b

What's New

We continuously update our library with improvements and new features. If you use or like our library, please star ⭐ this repo. Thank you!

Major Improvements

  • In this version, AUCMLoss automatically computes imratio, so users no longer need to provide it.
  • Renamed gamma to epoch_decay for the PESG and PDSCA optimizers, i.e., epoch_decay = 1/gamma.
  • Reimplemented ImbalancedDataGenerator for constructing imbalanced datasets for benchmarking. A tutorial is available here.
  • Improved implementations of APLoss by removing some redundant computations.
  • Merged SOAP_ADAM and SOAP_SGD optimizers into one optimizer SOAP. Tutorial is provided here.
  • Removed the TensorFlow dependency; LibAUC now requires only PyTorch.
  • Updated existing tutorials to match the new version of LibAUC. Tutorials are available here.
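
To illustrate the idea behind ImbalancedDataGenerator (a conceptual sketch, not the library's code), one can subsample one class so the positive fraction matches a target imratio; here imratio is assumed to mean #positives / #samples, and the helper name is hypothetical:

```python
import random

# Hypothetical sketch: keep all negatives and subsample positives so
# that the positive fraction approximates a target imratio. LibAUC's
# ImbalancedDataGenerator operates on arrays/tensors; this is
# index-based for illustration.

def make_imbalanced_indices(labels, imratio, seed=0):
    rng = random.Random(seed)
    pos = [i for i, y in enumerate(labels) if y == 1]
    neg = [i for i, y in enumerate(labels) if y == 0]
    # Choose n_pos so that n_pos / (n_pos + n_neg) ~= imratio.
    n_pos = max(1, round(imratio * len(neg) / (1 - imratio)))
    kept = rng.sample(pos, min(n_pos, len(pos))) + neg
    rng.shuffle(kept)
    return kept

labels = [1] * 50 + [0] * 50          # balanced toy labels
idx = make_imbalanced_indices(labels, imratio=0.1)
```

This is the kind of controlled imbalance used for benchmarking AUC optimizers against a known positive fraction.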

New Features

  • Introduced DualSampler and TriSampler for sampling data that best fits X-risk optimization, balancing the inner and outer estimation errors.
  • Introduced CompositionalAUCLoss and the PDSCA optimizer. A tutorial is provided here.
  • Introduced SogCLR with Dynamic Contrastive Loss for training self-supervised models with small batch sizes. Tutorial and code are provided here.
  • Introduced NDCG_Loss and SONG optimizer for optimizing NDCG. Tutorials are provided here.
  • Introduced pAUCLoss with three optimizers: SOPA, SOPAs, SOTAs for optimizing Partial AUROC. Tutorials are provided here.
  • Added three evaluation functions: auc_roc_score (binary/multi-task), auc_prc_score (binary/multi-task), and pauc_roc_score (binary).
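
For reference, the binary quantity that auc_roc_score measures is the fraction of (positive, negative) pairs the model ranks correctly, with ties counted as half. A minimal O(n_pos × n_neg) sketch (LibAUC's implementation is vectorized; this is for intuition only):

```python
# Pairwise definition of binary AUROC: probability that a random
# positive is scored above a random negative, ties counted as 0.5.

def auc_roc(scores, labels):
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    correct = sum(1.0 if sp > sn else 0.5 if sp == sn else 0.0
                  for sp in pos for sn in neg)
    return correct / (len(pos) * len(neg))
```

This pairwise view is also what PairwiseAUCLoss (added in 1.3.0) optimizes via differentiable surrogates.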

LibAUC v1.1.8

11 Jan 04:21
38bd42d

What's New

  • Fixed some bugs and improved training stability.

LibAUC v1.1.6

11 Nov 17:14
7a30153

What's New

  • Added support for multi-label training. A tutorial for training on CheXpert is available here!
  • Fixed some bugs and improved training stability.