When using the Llama 3 70B model from Bedrock with the Continue.dev extension in VS Code, I encounter the following error:
Error: HTTP 400 Bad Request from https://bedrock-runtime.us-west-2.amazonaws.com/model/meta.llama3-70b-instruct-v1:0/invoke {"message":"Malformed input request: #: required key [prompt] not found#: extraneous key [max_tokens] is not permitted#: extraneous key [messages] is not permitted#: extraneous key [anthropic_version] is not permitted, please reformat your input and try again."}
This error occurs whenever the Llama 3 70B model is invoked through the Continue.dev extension; the same setup works correctly with Anthropic models from Bedrock.
Could you please investigate this error and provide guidance on how to resolve it? Being able to use the Llama 3 70B model within Continue.dev would be valuable.
Thank you for your support!
To reproduce
Install VS Code and the Continue.dev extension, then configure an AWS CLI profile named bedrock.
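For reference, my Bedrock model entry in Continue.dev's config.json looks roughly like the sketch below. The field names (`provider`, `model`, `region`, `profile`) reflect my understanding of Continue's Bedrock provider configuration and may not match the current schema exactly:

```json
{
  "models": [
    {
      "title": "Llama 3 70B (Bedrock)",
      "provider": "bedrock",
      "model": "meta.llama3-70b-instruct-v1:0",
      "region": "us-west-2",
      "profile": "bedrock"
    }
  ]
}
```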
From the error: Malformed input request: #: required key [prompt] not found#: extraneous key [max_tokens] is not permitted#: extraneous key [messages] is not permitted#: extraneous key [anthropic_version] is not permitted, please reformat your input and try again.
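The rejected keys suggest the extension is sending an Anthropic-style request body to the Llama endpoint. As I understand Bedrock's invoke API for Meta Llama models, it expects a single `prompt` string (formatted with Llama 3's chat tokens) plus parameters like `max_gen_len` and `temperature`, rather than `messages`, `max_tokens`, and `anthropic_version`. A minimal sketch of the difference, building only the request payloads (the model ID is taken from the error URL; parameter values are illustrative):

```python
import json

# Anthropic-style body that the extension appears to be sending;
# the Llama endpoint rejects every one of these keys.
anthropic_style = {
    "anthropic_version": "bedrock-2023-05-31",
    "max_tokens": 512,
    "messages": [{"role": "user", "content": "Hello"}],
}

# Shape the Bedrock Llama 3 invoke endpoint expects: one `prompt` string
# using Llama 3's chat template, plus `max_gen_len` / `temperature`.
llama3_body = {
    "prompt": (
        "<|begin_of_text|><|start_header_id|>user<|end_header_id|>\n\n"
        "Hello<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n"
    ),
    "max_gen_len": 512,
    "temperature": 0.7,
}

# Serialized payload that would be passed as the request body.
payload = json.dumps(llama3_body)
```

With boto3 this payload would go to `bedrock-runtime` via `invoke_model(modelId="meta.llama3-70b-instruct-v1:0", body=payload)`; the point is that the body for Llama models must use the `prompt`-based schema, not the Anthropic messages schema.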