Error with Llama 3 70B model from Bedrock on Continue.dev #1260

Open · bug (Something isn't working)
KoStard opened this issue May 10, 2024 · 2 comments

KoStard commented May 10, 2024

Relevant environment info

- OS: macOS 14.4.1 (23E224)
- Continue: v0.8.27
- IDE: VSCode 1.89.0

Description

When using the Llama 3 70B model from Bedrock with the Continue.dev extension in VS Code, I'm encountering the following error:

Error: HTTP 400 Bad Request from https://bedrock-runtime.us-west-2.amazonaws.com/model/meta.llama3-70b-instruct-v1:0/invoke {"message":"Malformed input request: #: required key [prompt] not found#: extraneous key [max_tokens] is not permitted#: extraneous key [messages] is not permitted#: extraneous key [anthropic_version] is not permitted, please reformat your input and try again."}

This error occurs whenever Continue.dev invokes the Llama 3 70B model through Bedrock; the extension works correctly with Anthropic models from Bedrock.
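Judging by the rejected keys, the extension seems to be sending an Anthropic-style Messages body, while Meta's Llama models on Bedrock expect a single `prompt` string. A minimal TypeScript sketch of the two payload shapes (values are illustrative; the key names come from the error above and, for the Llama side, from the Bedrock inference parameters for Meta Llama):

```typescript
// What the extension appears to send: an Anthropic Messages-style body.
// Bedrock's Llama endpoint rejects every one of these keys (see error above).
const anthropicStyleBody = {
  anthropic_version: "bedrock-2023-05-31",
  max_tokens: 4096,
  messages: [{ role: "user", content: [{ type: "text", text: "test" }] }],
};

// What meta.llama3-70b-instruct-v1:0 expects on InvokeModel instead:
// a single flattened prompt string plus Llama-specific sampling keys.
const llamaBody = {
  prompt:
    "<|begin_of_text|><|start_header_id|>user<|end_header_id|>\n\n" +
    "test<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n",
  max_gen_len: 512,
  temperature: 0.7,
  top_p: 0.9,
};
```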

Could you please investigate this error and provide guidance on how to resolve it? Being able to use the Llama 3 70B model within Continue.dev would be valuable.

Thank you for your support!

To reproduce

1. Install VSCode and the Continue.dev extension, and configure an AWS CLI profile named `bedrock`.
2. Add the following entry to the "models" list in Continue's config (see the note after these steps):

   ```json
   {
     "title": "Bedrock: Llama3 70b",
     "provider": "bedrock",
     "model": "meta.llama3-70b-instruct-v1:0",
     "region": "us-west-2"
   }
   ```

3. Try to chat with Llama3 70b.
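Note: on a default Continue install, this entry lives in the "models" array of `~/.continue/config.json`; that path is an assumption based on Continue's standard setup, not something stated in this issue.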

### Log output

```Shell
[Extension Host] Error handling webview message: {
  "msg": {
    "messageId": "a404b96d-3142-4ea4-9283-fa485980fea5",
    "messageType": "llm/streamChat",
    "data": {
      "messages": [
        {
          "role": "user",
          "content": [
            {
              "type": "text",
              "text": "test"
            }
          ]
        }
      ],
      "title": "Bedrock: Llama3 70b",
      "completionOptions": {}
    }
  }
}

Error: HTTP 400 Bad Request from <...>/model/meta.llama3-70b-instruct-v1:0/invoke  {"message":"Malformed input request: #: required key [prompt] not found#: extraneous key [max_tokens] is not permitted#: extraneous key [messages] is not permitted#: extraneous key [anthropic_version] is not permitted, please reformat your input and try again."}
```

KoStard added the bug label May 10, 2024
sestinj self-assigned this May 20, 2024

sestinj (Contributor) commented May 20, 2024

@KoStard I just merged PR #1271; can you confirm that it solves this issue?

dsmithn commented May 20, 2024

I think it's going to be a bit more work. The Llama API uses a few different parameters. Here's a project that converts OpenAI messages into the Bedrock Llama prompt format (a sketch of the conversion follows below): https://github.com/jparkerweb/bedrock-wrapper/blob/75603a6f9f37dcc6d83709807d49cf003daf877d/bedrock-wrapper.js#L57

From the error: Malformed input request: #: required key [prompt] not found#: extraneous key [max_tokens] is not permitted#: extraneous key [messages] is not permitted#: extraneous key [anthropic_version] is not permitted, please reformat your input and try again.
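A minimal sketch of that conversion, assuming plain-text message content; the function name `toLlama3Prompt` is hypothetical, and the special tokens follow Meta's published Llama 3 instruct template (the linked bedrock-wrapper code does essentially the same flattening):

```typescript
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Flatten OpenAI-style chat messages into the single string that
// Bedrock's Llama `prompt` field expects.
function toLlama3Prompt(messages: ChatMessage[]): string {
  let prompt = "<|begin_of_text|>";
  for (const m of messages) {
    prompt += `<|start_header_id|>${m.role}<|end_header_id|>\n\n${m.content}<|eot_id|>`;
  }
  // Leave an open assistant header so the model generates the reply.
  return prompt + "<|start_header_id|>assistant<|end_header_id|>\n\n";
}

// Example: toLlama3Prompt([{ role: "user", content: "test" }]) produces the
// prompt string shown in the payload sketch earlier in this thread.
```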
