Nuxt Ollama Chat

About

Nuxt Ollama Chat is a minimal, open-source chat UI for Ollama. The UI uses Ollama's streaming API.

Features: model selector, code highlighting.
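
Under the hood, the UI talks to Ollama's streaming chat endpoint. As a rough sketch of that API (assuming Ollama's default port 11434, with llama3 used purely as an example model name), a streamed chat request looks like this:

curl http://localhost:11434/api/chat -d '{
  "model": "llama3",
  "messages": [{ "role": "user", "content": "Hello" }],
  "stream": true
}'

The response arrives as newline-delimited JSON objects, each carrying a chunk of message.content and ending with an object whose done field is true, which is what lets the chat UI render replies incrementally.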

Running Locally

1. Clone Repo

git clone https://github.com/apoStyLEE/nuxt-ollama-chat

2. Install Dependencies

# npm
npm install

# pnpm
pnpm install

# yarn
yarn install

# bun
bun install

3. Run the Ollama server

Either via the CLI:

ollama serve

or via the desktop client.
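
With the defaults, the Ollama API listens on http://localhost:11434. A quick sanity check (assuming the default port; llama3 is only an example model) is:

# should return a JSON list of locally installed models
curl http://localhost:11434/api/tags

# if the list is empty, pull a model first, e.g.
ollama pull llama3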

4. Run App (Development Server)

Start the development server on http://localhost:3000:

# npm
npm run dev

# pnpm
pnpm run dev

# yarn
yarn dev

# bun
bun run dev

Check out the Nuxt deployment documentation (https://nuxt.com/docs/getting-started/deployment) for more information.
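
For a production build, the standard Nuxt workflow should apply (a sketch assuming the default Nuxt scripts; this repository's package.json may differ):

# build the app for production
npm run build

# start the production server from the build output
node .output/server/index.mjs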
