A self-hosted web UI for 30+ generative AI models.
Updated May 10, 2024 · Python
Set up and run a local LLM and chatbot using consumer-grade hardware.
Local character AI chatbot with a Chroma vector store for memory, plus scripts to process documents for Chroma.
A tool for testing different large language models without writing code.
AgentX is an open-source library that helps people run LLMs on their own computers, or serve LLMs as easily as possible, with support for multiple backends including PyTorch, llama.cpp, Ollama, and EasyDeL.
A quick, optimized solution for managing llama-based GGUF quantized models: download GGUF files, retrieve message formatting, add more models from Hugging Face repos, and more. It's easy to use and comes prepacked with preconfigured open-source models: Dolphin Phi-2 2.7B, Mistral 7B v0.2, Mixtral 8x7B v0.1, SOLAR 10.7B, and Zephyr 3B.
A financial chatbot powered by an LLM and retrieval-augmented generation.
A YouTube API implementation with Meta's Llama 2 to analyze comments and sentiment.
Serve open-source models of your choice as a Docker container using llama-cpp-python's OpenAI-compatible server.
Repository for the Cybersecurity-M project course taught by Professor M. Colajanni.
Huginn Hears is a local app that transcribes and summarizes your meetings in Norwegian and English using state-of-the-art models and open-source libraries. No cloud needed; everything runs offline.
A Genshin Impact question-answering project powered by Qwen1.5-14B-Chat.
MINeD Hackathon 2024 - Project
A simple web interface for GGUF-format LLMs run with llama-cpp-python (llama.cpp).
Unofficial Gradio repo for the ICML 2024 paper "Executable Code Actions Elicit Better LLM Agents" by Xingyao Wang, Yangyi Chen, Lifan Yuan, Yizhe Zhang, Yunzhu Li, Hao Peng, and Heng Ji.
Simple chat interface for local AI using llama-cpp-python and llama-cpp-agent