# Contrastive Learning and Pre-trained Model Papers & Codes

A curated list of papers and code for contrastive learning and pre-trained models.

## Contrastive Learning

### Natural Language Processing

| Title | Conference | Codes | Note |
|---|---|---|---|
| SimCSE: Simple Contrastive Learning of Sentence Embeddings | EMNLP 2021 | [Torch] | unsupervised training sketch below |
| Understanding Contrastive Representation Learning through Alignment and Uniformity on the Hypersphere | ICML 2020 | [Torch] | |
| SimCLS: A Simple Framework for Contrastive Learning of Abstractive Summarization | ACL 2021 | github | |
| ConSERT: A Contrastive Framework for Self-Supervised Sentence Representation Transfer | ACL 2021 | github | |
| Contrastive Learning for Many-to-many Multilingual Neural Machine Translation | ACL 2021 | github | |
| Distributed Representations of Words and Phrases and their Compositionality | | | |
| An efficient framework for learning sentence representations | ICLR 2018 | | |
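
To make the sentence-embedding entries concrete, here is a minimal sketch of the unsupervised SimCSE idea referenced in the first row: the same batch of sentences is encoded twice so that dropout noise produces two views, and an in-batch InfoNCE loss pulls matching views together. The toy encoder, batch shapes, and temperature are illustrative assumptions, not the authors' released code.

```python
# Minimal sketch of unsupervised SimCSE-style training (dropout as the only augmentation).
# The toy encoder and all hyperparameters are illustrative assumptions, not the official code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyEncoder(nn.Module):
    """Stand-in for a Transformer encoder: embeds token ids and mean-pools."""
    def __init__(self, vocab_size=1000, dim=128, p_drop=0.1):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, dim)
        self.proj = nn.Linear(dim, dim)
        self.drop = nn.Dropout(p_drop)  # dropout noise makes each forward pass a different "view"

    def forward(self, token_ids):
        h = self.drop(self.emb(token_ids))
        return self.proj(h.mean(dim=1))            # sentence embedding: (batch, dim)

def simcse_loss(z1, z2, temperature=0.05):
    """In-batch InfoNCE: z1[i] should match z2[i]; every other z2[j] is a negative."""
    z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    sim = z1 @ z2.t() / temperature                # (batch, batch) cosine similarities
    labels = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(sim, labels)

encoder = ToyEncoder()
optimizer = torch.optim.Adam(encoder.parameters(), lr=1e-3)
batch = torch.randint(0, 1000, (32, 16))           # 32 sentences of 16 token ids each
z1, z2 = encoder(batch), encoder(batch)            # two dropout-noised views of the same batch
optimizer.zero_grad()
loss = simcse_loss(z1, z2)
loss.backward()
optimizer.step()
```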

### Computer Vision

| Title | Conference | Codes | Note |
|---|---|---|---|
| MoCo - Momentum Contrast for Unsupervised Visual Representation Learning | CVPR 2020 | Torch (official) | |
| MoCo v2 - Improved Baselines with Momentum Contrastive Learning | | Torch (official) | |
| SimCLR - A Simple Framework for Contrastive Learning of Visual Representations | ICML 2020 | TF (official), Torch | NT-Xent loss sketch below |
| SimCLR v2 - Big Self-Supervised Models are Strong Semi-Supervised Learners | NeurIPS 2020 | TF (official) | |
| BYOL - Bootstrap your own latent: A new approach to self-supervised Learning | | JAX (official), Torch | |
| SwAV - Unsupervised Learning of Visual Features by Contrasting Cluster Assignments | NeurIPS 2020 | Torch (official), Torch | |
| SimSiam - Exploring Simple Siamese Representation Learning | | Torch | |
| Prototypical Contrastive Learning of Unsupervised Representations | ICLR 2021 | github | |
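
For the image-side entries, the sketch below shows the NT-Xent (normalized temperature-scaled cross-entropy) objective popularized by SimCLR, as noted in that row: each image contributes two augmented views, the matching view is the positive, and all other views in the batch are negatives. The temperature and dimensions are placeholder assumptions rather than values from the official TF/Torch implementations.

```python
# Illustrative NT-Xent loss as used by SimCLR-style methods; hyperparameters are assumptions.
import torch
import torch.nn.functional as F

def nt_xent_loss(z1, z2, temperature=0.5):
    """z1, z2: (batch, dim) projections of two augmented views of the same images."""
    batch = z1.size(0)
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=-1)       # (2*batch, dim)
    sim = z @ z.t() / temperature                              # pairwise cosine similarities
    # Mask out self-similarity so an example is never its own negative.
    mask = torch.eye(2 * batch, dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(mask, float('-inf'))
    # The positive for index i is the other view of the same image, offset by `batch`.
    targets = torch.cat([torch.arange(batch, 2 * batch),
                         torch.arange(0, batch)]).to(z.device)
    return F.cross_entropy(sim, targets)

# Usage with dummy projections from two augmented views of the same 8 images:
z1, z2 = torch.randn(8, 128), torch.randn(8, 128)
print(nt_xent_loss(z1, z2).item())
```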

### Other

| Title | Conference | Codes | Note |
|---|---|---|---|
| HCL - Contrastive Learning With Hard Negative Samples | ICLR 2021 | Link | |
| A Theoretical Analysis of Contrastive Unsupervised Representation Learning | ICML 2019 | | |
| Contrastive Conditional Transport for Representation Learning | | | |
| Mining Better Samples for Contrastive Learning of Temporal Correspondence | CVPR 2020 | | |
| Scaling Deep Contrastive Learning Batch Size | | | |
| What Makes for Good Views for Contrastive Learning? | | | |
| Contrastive Multiview Coding | | | |
| Representation Learning with Contrastive Predictive Coding | | | InfoNCE sketch below |
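
Several of the entries above (Contrastive Predictive Coding, Contrastive Multiview Coding, and the theoretical analyses) are built around the InfoNCE estimator. The snippet below is a hedged sketch of the CPC-style variant flagged in the last row, where a context vector predicts a future latent and the other latents in the batch serve as negatives; the linear prediction head and all dimensions are assumptions for illustration.

```python
# Sketch of a CPC-style InfoNCE step: a context vector scores the true future latent
# against in-batch negatives. Dimensions and the linear prediction head are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

def cpc_infonce(context, future, predictor):
    """context: (batch, c_dim) summaries of the past; future: (batch, z_dim) true future latents."""
    pred = predictor(context)                      # predicted future latent: (batch, z_dim)
    scores = pred @ future.t()                     # (batch, batch) log-bilinear scores
    labels = torch.arange(context.size(0), device=context.device)
    return F.cross_entropy(scores, labels)         # row i must pick its own future, column i

predictor = nn.Linear(256, 128, bias=False)        # plays the role of W_k for one prediction step
context = torch.randn(16, 256)                     # e.g. autoregressive summaries of past latents
future = torch.randn(16, 128)                      # encoder outputs k steps ahead
print(cpc_infonce(context, future, predictor).item())
```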

## Pre-trained Model

| Title | Conference | Codes | Note |
|---|---|---|---|
| Cross-Attention is All You Need: Adapting Pretrained Transformers for Machine Translation | | | cross-attention block sketch below |
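
The table's single entry concerns adapting pretrained encoder-decoder Transformers via their cross-attention layers. As background only, the sketch below shows a generic cross-attention block (decoder queries attending to encoder states); it is not the paper's method, and every size and shape here is an assumption.

```python
# Minimal cross-attention block, shown only to illustrate the component named in the paper title.
# The model dimension, head count, and tensor shapes are illustrative assumptions.
import torch
import torch.nn as nn

class CrossAttentionBlock(nn.Module):
    def __init__(self, dim=512, num_heads=8):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, decoder_states, encoder_states):
        # Queries come from the decoder; keys and values come from the encoder output.
        attended, _ = self.attn(query=decoder_states, key=encoder_states, value=encoder_states)
        return self.norm(decoder_states + attended)    # residual connection + layer norm

block = CrossAttentionBlock()
dec = torch.randn(2, 10, 512)    # (batch, target_len, dim)
enc = torch.randn(2, 20, 512)    # (batch, source_len, dim)
print(block(dec, enc).shape)     # torch.Size([2, 10, 512])
```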
