Easy and lightning fast training of 🤗 Transformers on Habana Gaudi processor (HPU)
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Research and materials on hardware implementations of the Transformer model
🔍 LLM orchestration framework to build customizable, production-ready LLM applications. Connect components (models, vector DBs, file converters) to pipelines or agents that can interact with your data. With advanced retrieval methods, it's best suited for building RAG, question answering, semantic search or conversational agent chatbots.
Train transformer-based models.
Leveraging BERT and c-TF-IDF to create easily interpretable topics.
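The c-TF-IDF idea behind BERTopic treats all documents in a cluster as one "class document" and weights each term by its frequency in the class against its frequency overall. A minimal pure-Python sketch of one common formulation (term frequency normalized per class, times `log(1 + A / f_t)` with `A` the average class length) — an illustration, not BERTopic's actual implementation:

```python
import math
from collections import Counter

def ctfidf(classes):
    """Compute c-TF-IDF weights per class.

    classes: dict mapping class name -> list of tokens (all documents
    in the class concatenated). Returns dict class -> {term: weight}.
    """
    counts = {c: Counter(toks) for c, toks in classes.items()}
    # f_t: total frequency of each term across all classes
    total = Counter()
    for cnt in counts.values():
        total.update(cnt)
    # A: average number of tokens per class
    avg_words = sum(sum(c.values()) for c in counts.values()) / len(counts)
    weights = {}
    for c, cnt in counts.items():
        n_words = sum(cnt.values())
        weights[c] = {
            t: (f / n_words) * math.log(1 + avg_words / total[t])
            for t, f in cnt.items()
        }
    return weights
```

Terms frequent inside one class but rare elsewhere get the highest weights, which is what makes the resulting topic labels interpretable.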
Generalist and Lightweight Model for Relation Extraction (Extract any relationship types from text)
👑 Easy-to-use and powerful NLP and LLM library with 🤗 Awesome model zoo, supporting wide-range of NLP tasks from research to industrial applications, including 🗂Text Classification, 🔍 Neural Search, ❓ Question Answering, ℹ️ Information Extraction, 📄 Document Intelligence, 💌 Sentiment Analysis etc.
Neural Network Compression Framework for enhanced OpenVINO™ inference
Lightning-Fast Text Classification with LLM Embeddings on CPU
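Embedding-based text classification on CPU can be as simple as comparing a new text's embedding against per-class centroids. A hedged sketch of that pattern using plain Python lists as stand-ins for precomputed LLM embeddings (the function names here are illustrative, not any library's API):

```python
import math

def centroid(vectors):
    """Element-wise mean of a list of equal-length vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a)) or 1.0
    nb = math.sqrt(sum(y * y for y in b)) or 1.0
    return dot / (na * nb)

def fit_centroids(labeled):
    """labeled: dict label -> list of embedding vectors."""
    return {label: centroid(vs) for label, vs in labeled.items()}

def predict(centroids, vec):
    """Assign the label whose centroid is most similar to vec."""
    return max(centroids, key=lambda label: cosine(centroids[label], vec))
```

The heavy lifting happens once at embedding time; inference is then just a handful of dot products, which is why this style of classifier stays fast on CPU.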
💥 Fast State-of-the-Art Tokenizers optimized for Research and Production
Minimal keyword extraction with BERT
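KeyBERT-style extraction embeds the document and candidate phrases with the same model, then keeps the candidates most similar to the whole document. The sketch below mirrors that embed-and-compare strategy with a toy bag-of-words "embedding" standing in for BERT vectors (the real tool uses model embeddings; `toy_embed` is an assumption for illustration):

```python
import math
from collections import Counter

def toy_embed(text, vocab):
    """Stand-in for a BERT sentence embedding: bag-of-words counts
    over a fixed vocabulary."""
    counts = Counter(text.lower().split())
    return [counts[w] for w in vocab]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a)) or 1.0
    nb = math.sqrt(sum(y * y for y in b)) or 1.0
    return dot / (na * nb)

def extract_keywords(doc, candidates, top_n=2):
    """Rank candidate phrases by similarity to the whole document."""
    vocab = sorted(set(doc.lower().split()))
    doc_vec = toy_embed(doc, vocab)
    ranked = sorted(candidates,
                    key=lambda c: cosine(toy_embed(c, vocab), doc_vec),
                    reverse=True)
    return ranked[:top_n]
```

Swapping `toy_embed` for a real sentence-embedding model gives the semantic matching that makes this approach outperform frequency-only keyword extractors.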
This project was built for the Northern Virginia Technology Education Initiative. A local Retrieval-Augmented Generation (RAG) pipeline for completing arbitrary tasks. This implementation accelerates email writing using previously written emails, but with a couple of changes it can be adapted to any task!
LLM-PowerHouse: Unleash LLMs' potential through curated tutorials, best practices, and ready-to-use code for custom training and inferencing.
Precedent decision retrieval system with BM25+BERT
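In a BM25+BERT retrieval pipeline, BM25 typically produces the first-stage candidate list that a BERT re-ranker then refines. A self-contained sketch of standard Okapi BM25 scoring (default `k1`/`b` values shown are common choices, not this project's settings):

```python
import math
from collections import Counter

def bm25_scores(query, docs, k1=1.5, b=0.75):
    """Okapi BM25: score each tokenized doc against a tokenized query."""
    N = len(docs)
    avgdl = sum(len(d) for d in docs) / N
    # document frequency, then smoothed IDF, for each query term
    df = {t: sum(1 for d in docs if t in d) for t in set(query)}
    idf = {t: math.log((N - n + 0.5) / (n + 0.5) + 1) for t, n in df.items()}
    scores = []
    for d in docs:
        tf = Counter(d)
        s = sum(
            idf[t] * tf[t] * (k1 + 1)
            / (tf[t] + k1 * (1 - b + b * len(d) / avgdl))
            for t in query if tf[t] > 0
        )
        scores.append(s)
    return scores
```

The term-frequency saturation (`k1`) and length normalization (`b`) are what distinguish BM25 from plain TF-IDF, and they matter for long documents like court decisions.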
BertChunker: Efficient and Trained Chunking for Unstructured Documents. Trains BERT to segment documents into semantic chunks.
A Unified Library for Parameter-Efficient and Modular Transfer Learning
Bias evaluation of Differentially Private NLP models