LLM Stack

An all-in-one Docker Compose configuration that provides access to local and external LLMs through multiple chat interfaces.

Components

  • Caddy: Acts as the central entrypoint for the whole stack
  • Ollama: Serves local LLM models
  • LiteLLM: OpenAI-compatible API proxy for local Ollama-served models and upstream hosted models
  • Multiple ChatGPT-style web interfaces for interacting with the models
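
A minimal sketch of how these services could be wired together in docker-compose.yml follows; the image tags, ports, and volume layout here are assumptions for illustration, not the repository's actual file:

```yaml
services:
  caddy:
    image: caddy:2
    ports:
      - "3000:3000"                 # single entrypoint for the whole stack
    volumes:
      - ./Caddyfile:/etc/caddy/Caddyfile:ro
  ollama:
    image: ollama/ollama
    volumes:
      - ollama-data:/root/.ollama   # persist downloaded model weights
  litellm:
    image: ghcr.io/berriai/litellm:main-latest
    command: ["--config", "/app/config.yaml"]
    volumes:
      - ./litellm/config.yaml:/app/config.yaml:ro
    env_file: .env                  # provider API keys
volumes:
  ollama-data:
```

Caddy fronts everything on one port and reverse-proxies to the chat frontends and LiteLLM, so only a single port needs to be exposed on the host.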

Models

  • Local
    • local-mistral
    • local-mixtral-8x7b
    • local-llama3-8b
  • OpenAI
    • openai-gpt-3.5-turbo
    • openai-gpt-4-turbo
    • openai-gpt-4o
  • Google
    • google-gemini-1.5-pro
  • Anthropic
    • anthropic-claude-3-sonnet
    • anthropic-claude-3-opus
  • Groq
    • groq-llama3-70b
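
The aliases above map onto a LiteLLM model_list. The sketch below shows the general shape, assuming local models resolve to the Ollama container and hosted models read their keys from the environment; the api_base hostname and upstream model IDs are assumptions:

```yaml
model_list:
  # Local model served by the Ollama container (hostname assumed)
  - model_name: local-mistral
    litellm_params:
      model: ollama/mistral
      api_base: http://ollama:11434
  # Hosted models; keys are read from the environment via .env
  - model_name: openai-gpt-4o
    litellm_params:
      model: openai/gpt-4o
      api_key: os.environ/OPENAI_API_KEY
  - model_name: anthropic-claude-3-opus
    litellm_params:
      model: anthropic/claude-3-opus-20240229
      api_key: os.environ/ANTHROPIC_API_KEY
```

Every frontend then talks to one OpenAI-compatible endpoint and sees local and hosted models side by side under these alias names.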

Chat Frontends

Getting Started

Prerequisites

  • Docker
  • Docker Compose
  • Git

Setup

  1. Clone this repository
  2. Copy the default config: cp default.env .env
  3. Edit .env and add the relevant API keys
  4. Start the Docker Compose configuration: docker-compose up
  5. Access the Caddy webserver at http://localhost:3000
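
For step 3, .env ends up holding one API key per hosted provider. The variable names below are hypothetical; the authoritative names are in default.env:

```
# Hypothetical variable names; check default.env for the real ones
OPENAI_API_KEY=sk-...
GEMINI_API_KEY=...
ANTHROPIC_API_KEY=...
GROQ_API_KEY=...
```

Providers you leave unconfigured simply won't be usable through LiteLLM; the local Ollama models need no key.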
