LlamaIndex is a data framework for your LLM applications
The open-source serverless GPU container runtime.
OneTrainer is a one-stop solution for all your stable diffusion training needs.
Personal Project: MPP-Qwen14B (Multimodal Pipeline Parallel-Qwen14B). Don't let poverty limit your imagination! Train your own 14B LLaVA-like MLLM on an RTX 3090/4090 with 24GB.
Magick is a cutting-edge toolkit for a new kind of AI builder. Make Magick with us!
[ACL 2023] The code for our ACL'23 paper Cold-Start Data Selection for Few-shot Language Model Fine-tuning: A Prompt-Based Uncertainty Propagation Approach
Open source data anonymization and synthetic data orchestration for developers. Create high fidelity synthetic data and sync it across your environments.
Multi-LoRA inference server that scales to 1000s of fine-tuned LLMs
Implementations of various transformer architectures, with application and fine-tuning code.
Scalable and flexible workflow orchestration platform that seamlessly unifies data, ML and analytics stacks.
Implementations of various ML tasks on the Kaggle platform with GPUs.
Run any open-source LLM, such as Llama 2 or Mistral, as an OpenAI-compatible API endpoint in the cloud.
Python client library for improving your LLM app accuracy
This repository highlights the reasoning capabilities of ✨ Mistral / LLaMA-3 / Phi-3 / Gemma / Flan-T5 / GPT-4o ✨ in targeted sentiment analysis of Russian and English-translated mass-media texts 📊
RAG (Retrieval Augmented Generation) Framework for building modular, open source applications for production by TrueFoundry
Fine-tuning GPT-2 for spam detection
Distributed ML Training and Fine-Tuning on Kubernetes
H2O LLM Studio - a framework and no-code GUI for fine-tuning LLMs. Documentation: https://h2oai.github.io/h2o-llmstudio/
Easy and lightning fast training of 🤗 Transformers on Habana Gaudi processor (HPU)