
EasyLLM Terminal Chat Example

A simple terminal-based chat application demonstrating how to use EasyLLM for interactive AI conversations.

Installation

```shell
npm install @llmvin/easyllm
```

Configuration

Set your API key as an environment variable:

```shell
export EASYLLM_API_KEY="your-llm-vin-api-key"
```

Get your API key from llm.vin.
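Because the client reads the key from the environment, a missing variable is easiest to catch at startup. A minimal guard might look like the following; the helper name `requireApiKey` is an assumption for illustration, not part of the example's actual code:

```javascript
// Return the API key from the environment, or throw a clear error
// instead of letting the client fail later with a cryptic 401.
function requireApiKey(env = process.env) {
  const key = env.EASYLLM_API_KEY;
  if (!key) {
    throw new Error('EASYLLM_API_KEY is not set; get a key from llm.vin');
  }
  return key;
}
```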

Running the Example

```shell
node chat.js
```

Features

  • Interactive terminal chat interface
  • Real-time AI responses using the llama4-scout model
  • Simple error handling
  • Clean exit with "exit" command

Code Overview

This example demonstrates:

  • Initializing the EasyLLM client
  • Creating chat completions with the OpenAI-compatible API
  • Handling user input with Node.js readline
  • Basic error handling for API calls
```javascript
const { EasyLLM } = require('@llmvin/easyllm');

const client = new EasyLLM({
  apiKey: process.env.EASYLLM_API_KEY
});

// Top-level await is not available in CommonJS modules, so the
// completion call runs inside an async function.
async function askModel(userMessage) {
  const response = await client.chat.completions.create({
    model: 'llama4-scout',
    messages: [
      { role: 'user', content: userMessage }
    ]
  });
  // OpenAI-compatible response shape
  return response.choices[0].message.content;
}
```
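The "simple error handling" mentioned above can take the shape of a small wrapper that turns a failed API call into a readable message instead of crashing the chat loop. The name `safeAsk` is hypothetical, added here for illustration:

```javascript
// Wrap a model call so that network or API failures surface as a
// readable chat line rather than an unhandled rejection.
async function safeAsk(askModel, userMessage) {
  try {
    return await askModel(userMessage);
  } catch (err) {
    return `[error] request failed: ${err.message}`;
  }
}
```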

Learn More
