
Run your Ollama model locally using Ollama UI.

About Ollama UI

(Screenshot of the Ollama UI)

Ollama UI is a web interface for Ollama. It lets you run local Ollama models in a ChatGPT-style chat with simple configuration, and it works without an internet connection once a model has been downloaded (pulled).

Getting Started

First, run the setup script:

sh setup.sh
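
The UI talks to a local Ollama server on its default port, so Ollama should be running and at least one model pulled before you open the app. A minimal sketch of the surrounding steps, assuming the standard Ollama CLI and the usual Next.js scripts (setup.sh may already cover part of this):

  ollama serve          # start the local Ollama server (default port 11434)
  ollama pull llama3    # pull a model so the UI can work offline (any model name)
  npm install           # assumed: install the Next.js app's dependencies
  npm run dev           # assumed: start the UI, typically at http://localhost:3000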

Tech Stack

  • Next.js
  • Tailwind CSS
  • shadcn/ui
  • LangChain.js (see the connection sketch after this list)
  • Docker
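
On the chat side, LangChain.js is what connects the UI to the local Ollama server. A minimal sketch of that connection, assuming the @langchain/community package and an illustrative model name; this is not necessarily how the repository wires it up:

  // Chat with a locally pulled Ollama model through LangChain.js.
  // Assumes `ollama pull llama3` has already been run; names are illustrative.
  import { ChatOllama } from "@langchain/community/chat_models/ollama";

  const chat = new ChatOllama({
    baseUrl: "http://localhost:11434", // Ollama's default local endpoint
    model: "llama3",                   // any model that has been pulled locally
  });

  const reply = await chat.invoke("Why is the sky blue?");
  console.log(reply.content);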

Feature Roadmap

  • Generate completions (see the API sketch after this list).
  • Select from a list of local models.
  • Render code blocks in assistant responses.
  • Custom prompts. ... (Add additional features here)
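
The first two roadmap items map directly onto Ollama's local REST API: GET /api/tags returns the models that have been pulled, and POST /api/generate produces a completion. A hedged sketch of both calls in TypeScript, illustrative only and not the repository's actual implementation:

  const OLLAMA_URL = "http://localhost:11434";

  // List locally pulled models (e.g. to populate a model picker).
  export async function listLocalModels(): Promise<string[]> {
    const res = await fetch(`${OLLAMA_URL}/api/tags`);
    const data = await res.json();
    return data.models.map((m: { name: string }) => m.name);
  }

  // Request a single, non-streaming completion from a chosen model.
  export async function generateCompletion(model: string, prompt: string): Promise<string> {
    const res = await fetch(`${OLLAMA_URL}/api/generate`, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ model, prompt, stream: false }),
    });
    const data = await res.json();
    return data.response;
  }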
