(Work in Progress) A cross-platform desktop client for offline LlaMA-CPU
Updated Nov 1, 2023 - C#
A terminal-style user interface for chatting with AI characters, using LLaMA LLMs for locally processed AI.
PARRoT: Precise Audio Recognition and Recap over Transcription
Filling in the gaps with LangChain, and creating OO wrappers to simplify some workloads.
GUI for GGML Alpaca models
A little single-file frontend for llama.cpp/examples/server, created with Vue, Tailwind CSS, and Flask.
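Frontends like the one above talk to the llama.cpp example server over its HTTP API. A minimal client sketch (assuming the server's default `/completion` endpoint on `localhost:8080`; host and port are configurable when launching the server) might look like:

```python
import json
import urllib.request

def build_completion_request(prompt, n_predict=64, host="http://localhost:8080"):
    """Build an HTTP POST request for the llama.cpp example server's
    /completion endpoint. The endpoint accepts a JSON body with the
    prompt and the maximum number of tokens to generate."""
    payload = json.dumps({"prompt": prompt, "n_predict": n_predict}).encode("utf-8")
    return urllib.request.Request(
        f"{host}/completion",
        data=payload,
        headers={"Content-Type": "application/json"},
    )

# Sending the request requires a running llama.cpp server:
# with urllib.request.urlopen(build_completion_request("Hello")) as resp:
#     print(json.loads(resp.read())["content"])
```

This is only a sketch of the request shape, not a complete client; the server also supports streaming responses and additional sampling parameters.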
Control what LLMs can, and can't, say
Evaluate open-source language models on agent use, formatted output, instruction following, long text, multilingual, coding, and custom-task capabilities.
(Windows/Linux) Local WebUI with neural network models (LLM, Stable Diffusion, AudioCraft, AudioLDM2, TTS, Bark, Whisper, Demucs, LibreTranslate, ZeroScope2, TripoSR, Shap-E, GLIGEN, Wav2Lip, Roop, Rembg, CodeFormer, Moondream 2) in Python (Gradio interface).
Describe images in the WordPress Media Library using a local large language model. Generates titles, captions, descriptions, and alt tags.
Generate documentation using Hugging Face embeddings and local LLMs
A web interface to chat, get embeddings, and complete text with a LLaMA model.
Eternal is an experimental platform for machine learning models and workflows.