
ramChat for Ollama

About

ramChat is an open-source, minimal chat UI for Ollama. The UI streams responses through the Ollama streaming API and is built with Vue 3, Nuxt 3, Tailwind CSS, and Nuxt UI.

Reference repo: nuxt-ollama-chat

[Screenshot: Model Selector]
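
For context on what the UI does under the hood, here is a minimal sketch of streaming a chat completion from Ollama's REST API, runnable with Node 18+. The address http://localhost:11434 is Ollama's default, and the model name llama3 is only an example; this is not the app's actual client code.

// Minimal sketch: stream a chat completion from a local Ollama server.
// Assumes Ollama listens on its default address (http://localhost:11434)
// and that the example model "llama3" has been pulled.
async function streamChat(prompt: string): Promise<void> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3",
      messages: [{ role: "user", content: prompt }],
      stream: true,
    }),
  });
  if (!res.ok || !res.body) throw new Error(`Ollama request failed: ${res.status}`);

  // Ollama streams newline-delimited JSON objects; print tokens as they arrive.
  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let buffered = "";
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    buffered += decoder.decode(value, { stream: true });
    const lines = buffered.split("\n");
    buffered = lines.pop() ?? ""; // keep any partial line for the next chunk
    for (const line of lines) {
      if (!line.trim()) continue;
      const chunk = JSON.parse(line);
      process.stdout.write(chunk.message?.content ?? "");
      if (chunk.done) return;
    }
  }
}

streamChat("Why is the sky blue?").catch(console.error);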

Running Locally

1. Clone Repo

git clone https://github.com/lingdu2012/ramOllama.git
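
Then change into the project directory (its name comes from the repo URL above):

cd ramOllama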

2. Install Dependencies

# npm
npm install

# pnpm
pnpm install

# yarn
yarn install

# bun
bun install

3. Run Ollama Server

Either via the CLI:

ollama serve

or via the desktop client
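
To confirm the server is reachable, list the locally available models (this assumes Ollama's default address, http://localhost:11434):

curl http://localhost:11434/api/tags

If no models are installed yet, pull one first, e.g.:

ollama pull llama3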

4. Run App (Development Server)

Start the development server on http://localhost:3000:

# npm
npm run dev

# pnpm
pnpm run dev

# yarn
yarn dev

# bun
bun run dev
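
The app is served at http://localhost:3000 by default. If that port is in use, Nuxt's dev server accepts a --port flag; with npm, the extra -- forwards the flag to the underlying script (shown for npm only, as a sketch):

npm run dev -- --port 3001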

Check out these docs for more information:

Nuxt Documentation

Nuxt UI Documentation

Ollama Documentation

Tailwind CSS Documentation