Run any open-source LLM, such as Llama 2 or Mistral, as an OpenAI-compatible API endpoint in the cloud.
Updated May 9, 2024 · Python
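Serving engines like the one above expose the standard OpenAI chat-completions wire format, so any HTTP client can talk to them. A minimal sketch of building such a request with the standard library — the base URL, port, and model name are placeholder assumptions, not values from any specific project:

```python
# Minimal sketch: constructing a request for an OpenAI-compatible endpoint.
# Base URL and model name below are hypothetical; substitute your deployment's values.
import json
import urllib.request


def build_chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build a POST request against the standard /chat/completions route."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url=f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


# Example: point the client at a self-hosted endpoint instead of api.openai.com.
req = build_chat_request("http://localhost:3000/v1", "llama-2-7b-chat", "Hello!")
```

Sending the request (e.g. with `urllib.request.urlopen(req)`) returns a JSON body in the same shape the OpenAI API uses, which is what makes these endpoints drop-in compatible with existing OpenAI client code.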
RAG (Retrieval-Augmented Generation) framework for building modular, open-source applications for production, by TrueFoundry
AIConfig is a config-based framework to build generative AI applications.
An end-to-end LLM reference implementation providing a Q&A interface for Airflow and Astronomer
Python SDK for running evaluations on LLM-generated responses
Friendli: the fastest serving engine for generative AI
Miscellaneous code and writings for MLOps