Awesome Knowledge Distillation
Updated May 23, 2024
This repository holds the implementation for the thesis project "Improving Collaborative Filtering Techniques by the Use of Co-Training in Recommender Systems", carried out by Fernando Benjamín Pérez Maurera under the supervision of Professor Paolo Cremonesi and Engineer Maurizio Ferrari at Politecnico di Milano.
Public repository of the EMNLP 2023 paper "DisCo: Co-training Distilled Student Models for Semi-supervised Text Mining"
Code and model for "Multi-dataset Training of Transformers for Robust Action Recognition", NeurIPS 2022 Spotlight
[Nature Machine Intelligence] Official PyTorch implementation of "Uncertainty-Guided Dual-Views for Semi-Supervised Volumetric Medical Image Segmentation"
End-of-second-year engineering project
Semi-supervised learning techniques (pseudo-labeling, MixMatch, and co-training) for a pre-trained BERT language model in a low-data regime, based on molecular SMILES from the MoleculeNet benchmark.
Official Code for our Findings of ACL 2022 paper: Co-training an Unsupervised Constituency Parser with Weak Supervision
Large-scale semi-supervised annotation (self-learning, co-training) for text (code for papers @ KDD17, @ KAIS19); Repository maintained by Vasileios Iosifidis.
[MICCAI2021] This is an official PyTorch implementation for "Duo-SegNet: Adversarial Dual-Views for Semi-Supervised Medical Image Segmentation"
Study and implementation of semi-supervised classification (SSC) methods with Spark ML
Empower Sequence Labeling with Task-Aware Neural Language Model | a PyTorch Tutorial to Sequence Labeling
Audio classification for a Kaggle competition
Data analytics project in R to create a predictive model for the detection of objectivity in sports articles, based on co-training.
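Several of the repositories above apply the same underlying co-training loop: two learners, each trained on a different feature view, iteratively pseudo-label their most confident unlabeled examples for each other. A minimal sketch of that generic loop, using hypothetical nearest-centroid learners and invented function names (none of this comes from the listed projects):

```python
def centroid_fit(X, y):
    # Nearest-centroid learner: one mean feature vector per class.
    cents = {}
    for c in set(y):
        pts = [x for x, lab in zip(X, y) if lab == c]
        cents[c] = [sum(col) / len(pts) for col in zip(*pts)]
    return cents

def centroid_predict(cents, x):
    # Predict the closest centroid's class; confidence is the relative margin.
    d = {c: sum((a - b) ** 2 for a, b in zip(m, x)) for c, m in cents.items()}
    best = min(d, key=d.get)
    conf = 1.0 - d[best] / (max(d.values()) + 1e-9)
    return best, conf

def co_train(view1, view2, labels, rounds=3, per_round=2):
    # labels: per-item class or None; each view's learner pseudo-labels
    # its most confident unlabeled items for the other view to train on.
    labels = list(labels)
    for _ in range(rounds):
        if all(l is not None for l in labels):
            break
        for view in (view1, view2):
            lab = [i for i, l in enumerate(labels) if l is not None]
            cents = centroid_fit([view[i] for i in lab], [labels[i] for i in lab])
            preds = [(i, *centroid_predict(cents, view[i]))
                     for i, l in enumerate(labels) if l is None]
            preds.sort(key=lambda t: -t[2])  # most confident first
            for i, c, _ in preds[:per_round]:
                labels[i] = c  # pseudo-label consumed by the other view's learner
    return labels
```

Real implementations swap the centroid learner for whatever model the project uses (BERT, a segmentation network, a parser) and add a confidence threshold, but the alternating pseudo-labeling structure is the same.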