Guide to self-hosting AI models using Traefik on a home network, offering cost-effective and controlled alternatives to cloud-based services.
Updated Dec 18, 2023 - Makefile
Extremely simple chat interface for ollama models.
Odin Runes, a Java-based GPT client, frees you from vendor lock-in, allowing seamless interaction with your preferred GPT model right from your favorite text editor. It also aids prompt engineering by extracting context from diverse sources using technologies such as OCR, improving productivity and reducing costs.
Your gateway to both Ollama and Apple MLX models
Minimalistic UI for Ollama LMs - this React interface for LLMs drastically improves the chatbot experience and works offline.
Desktop UI for Ollama made with PyQt
Spring break project for easier access to 'ollama' language models.
A single chat UI for Ollama
A simple and easy-to-use Ollama web UI
Automated (unofficial) Docker Hub mirror of tagged images on open-webui's GHCR repo
docker compose to load ollama, flowise, langfuse, open-web-ui
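For a stack like the one above, a trimmed compose file covering just the Ollama and Open WebUI services might look like the following sketch (image names are the publicly published ones; port mappings and the volume name are illustrative assumptions, and the Flowise and Langfuse services are omitted for brevity):

```yaml
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"          # Ollama's default API port
    volumes:
      - ollama:/root/.ollama   # persist pulled models across restarts

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"            # host port 3000 is an arbitrary choice
    environment:
      # Point the UI at the ollama service on the compose network
      - OLLAMA_BASE_URL=http://ollama:11434
    depends_on:
      - ollama

volumes:
  ollama:
```

Running `docker compose up -d` would then expose the chat UI on port 3000 and the Ollama API on 11434.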
Simpler than simple: run LLMs on your computer effortlessly with Ollama, no GPU required.
Witsy: desktop AI assistant
A simple interface for interacting with LLMs via a local installation of Ollama
A small web application for chatting with local LLMs via the Ollama API.
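Several of the clients listed here talk to Ollama's local HTTP API. A minimal sketch of such a call, assuming Ollama is serving on its default port 11434 (the `ask` helper and model name are illustrative, not from any listed project):

```python
import json
import urllib.request

# Ollama's local HTTP API listens on port 11434 by default.
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_payload(model, prompt):
    """Build the JSON body for a non-streaming /api/generate request."""
    return {"model": model, "prompt": prompt, "stream": False}


def ask(model, prompt):
    """Send a prompt to a locally running Ollama instance and return the reply text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(model, prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    # Requires `ollama serve` to be running and the model pulled,
    # e.g. `ollama pull llama3` (model name is an example).
    print(ask("llama3", "Say hello in one short sentence."))
```

With `"stream": False` the server returns a single JSON object; streaming clients instead read newline-delimited JSON chunks from the same endpoint.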
Ollama Chat is a GUI for Ollama designed for macOS.
An Ollama client made with GTK4 and Adwaita