PhoBERT: Pre-trained language models for Vietnamese (EMNLP-2020 Findings)
Updated Feb 24, 2024
BETO - Spanish version of the BERT model
A mini implementation of the DIET Classifier in PyTorch.
Minimal example of using a traced Hugging Face Transformers model with libtorch.
The "Open Source Models with Hugging Face" course teaches you to leverage open-source models from the Hugging Face Hub for NLP, audio, image, and multimodal tasks.
Contextual Emotion Detection in Text (DoubleDistilBert Model)
Sentential Semantic Similarity measurement library using BERT Embeddings for spatial distance evaluation.
Training a BERT model from scratch.
AllenNLP integration for Shiba: Japanese CANINE model
Multi-Label Text Classification by fine-tuning BERT and XLNet and deployment using Flask
Extensible implementation of a large language model (LLM) training framework in Scala.
Resources regarding the Transformers library.
Language modelling (text generation, spell correction), sentiment analysis, and POS tagging with MLP, RNN, CNN, and BERT models, plus LLM prompting.
A dedicated, convenient repo for various Music Transformer implementations (Reformer/XTransformer/Sinkhorn/etc.).
An ASR (Automatic Speech Recognition) adversarial attack repository.
PyTorch code for cross-modal retrieval on Flickr8k/30k using BERT and EfficientNet.
Experiments applying pretrained BERT models from Hugging Face Transformers to various NLP tasks.
ML (Machine Learning)/NLP project for extracting names, Aadhaar IDs, and PAN numbers. It uses Pytesseract OCR to extract text from images, a Hugging Face NER model for name extraction, and regular expressions to extract PAN and Aadhaar numbers.
Re-implementation of the method proposed in the paper "Style Aligned Image Generation via Shared Attention" in PyTorch.
A study benchmarking Whisper-based ASR models in Malayalam.
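As a taste of what these projects involve, the regex-based PAN/Aadhaar extraction mentioned above can be sketched in a few lines. This is a hedged illustration: the patterns below follow the publicly documented ID formats (PAN: five letters, four digits, one letter; Aadhaar: twelve digits, commonly grouped 4-4-4), not the exact patterns used in that repository.

```python
import re

# Illustrative patterns based on the public ID formats, not the repo's actual code.
PAN_RE = re.compile(r"\b[A-Z]{5}[0-9]{4}[A-Z]\b")          # e.g. ABCDE1234F
AADHAAR_RE = re.compile(r"\b\d{4}[ -]?\d{4}[ -]?\d{4}\b")  # e.g. 1234 5678 9012

def extract_ids(text: str) -> dict:
    """Return PAN-like and Aadhaar-like numbers found in OCR'd text."""
    return {
        "pan": PAN_RE.findall(text),
        "aadhaar": AADHAAR_RE.findall(text),
    }
```

In the full pipeline described above, `text` would be the output of Pytesseract OCR, and a separate NER model would handle name extraction.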