llm-bedrock-anthropic

Plugin for LLM (https://llm.datasette.io/) adding support for Anthropic's Claude models on AWS Bedrock.

New: Claude Opus model

The claude-3-opus model is available in the us-west-2 region.

Installation

Install this plugin in the same environment as LLM:

llm install llm-bedrock-anthropic
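
To confirm the plugin installed correctly, you can list the models LLM knows about; the bedrock- models described under Usage below should appear in the output:

llm models list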

Configuration

You will need to configure AWS credentials in the usual boto3 ways, via configuration files or environment variables.

For example, to use the us-west-2 region and the AWS credentials stored under the personal profile, set these environment variables:

export AWS_DEFAULT_REGION=us-west-2
export AWS_PROFILE=personal
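
If you have the AWS CLI installed (it is not required by the plugin itself), you can check that these settings resolve to valid credentials before calling the models:

# Should print the account and ARN for the "personal" profile
aws sts get-caller-identity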

Usage

This plugin adds models called bedrock-claude and bedrock-claude-instant.

You can query them like this:

llm -m bedrock-claude-instant "Ten great names for a new space station"
llm -m bedrock-claude "Compare and contrast the leadership styles of Abraham Lincoln and Boris Johnson."

Options

  • max_tokens_to_sample, default 8_191: The maximum number of tokens to generate before stopping

Use like this:

llm -m bedrock-claude -o max_tokens_to_sample 20 "Sing me the alphabet"
 Here is the alphabet song:

A B C D E F G
H I J
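
If you use one of these models most of the time, you can make it the default so that the -m option is no longer needed; this uses LLM's standard default-model mechanism:

llm models default bedrock-claude
llm "Ten great names for a new space station"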
