A simple terminal-based chat application demonstrating how to use EasyLLM for interactive AI conversations.
Install the package:

```bash
npm install @llmvin/easyllm
```
Set your API key as an environment variable:

```bash
export EASYLLM_API_KEY="your-llm-vin-api-key"
```

You can get an API key from llm.vin.
Run the example:

```bash
node chat.js
```
Features:

- Interactive terminal chat interface
- Real-time AI responses using the `llama4-scout` model
- Simple error handling
- Clean exit with the `exit` command
This example demonstrates:
- Initializing the EasyLLM client
- Creating chat completions with the OpenAI-compatible API
- Handling user input with Node.js readline (see the input-loop sketch after the core snippet below)
- Basic error handling for API calls
The core client setup and completion call (wrapped here in an `ask` helper so the request can be awaited):

```javascript
const { EasyLLM } = require('@llmvin/easyllm');

// Read the API key from the environment
const client = new EasyLLM({
  apiKey: process.env.EASYLLM_API_KEY
});

// Send a single user message and return the model's reply text
async function ask(userMessage) {
  const response = await client.chat.completions.create({
    model: 'llama4-scout',
    messages: [
      { role: 'user', content: userMessage }
    ]
  });
  return response.choices[0].message.content;
}
```