
Amazon Bedrock : amazon.titan-text-express-v1 : how can I setup it to use code generation on jupyter lab #769

Closed
technqvi opened this issue May 1, 2024 · 7 comments
Labels
bug Something isn't working

Comments


technqvi commented May 1, 2024

Hi,
I attempted to use amazon.titan-text-express-v1, but it raised an error even though my AWS account is configured on my local development machine (Windows).
As the screenshot below shows, I left the AWS profile and region fields blank so that the defaults would be used.
Should that be enough to run code generation? Below is an excerpt from the full error.
How can I tackle this problem?

[screenshot]

File ~\AppData\Roaming\Python\Python311\site-packages\langchain_community\llms\bedrock.py:552, in BedrockBase._prepare_input_and_invoke(self, prompt, system, messages, stop, run_manager, **kwargs)
    547     text, body, usage_info = LLMInputOutputAdapter.prepare_output(
    548         provider, response
    549     ).values()
    551 except Exception as e:
--> 552     raise ValueError(f"Error raised by bedrock service: {e}")
    554 if stop is not None:
    555     text = enforce_stop_tokens(text, stop)

ValueError: Error raised by bedrock service: An error occurred (ResourceNotFoundException) when calling the InvokeModel operation: Could not resolve the foundation model from the provided model identifier.

I tried searching for similar issues, but I haven't found one yet.
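
For anyone debugging the same setup: the traceback above goes through langchain's Bedrock wrapper, which uses boto3, so a quick boto3 check shows what the blank profile/region fields actually resolve to. A minimal sketch, assuming boto3 is installed:

```python
# Minimal sketch (not from the original report): inspect what the default
# AWS profile resolves to when the profile and region fields are left blank.
import boto3

session = boto3.Session()  # no profile_name -> the "default" profile
print("Resolved region:", session.region_name)  # None means no default region is configured
print("Credentials found:", session.get_credentials() is not None)
```

If the region prints as None, or as a region where the model is not enabled, that could explain the ResourceNotFoundException.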

technqvi added the bug (Something isn't working) label on May 1, 2024

welcome bot commented May 1, 2024

Thank you for opening your first issue in this project! Engagement like this is essential for open source projects! 🤗

If you haven't done so already, check out Jupyter's Code of Conduct. Also, please try to follow the issue template as it helps other community members to contribute more effectively.
You can meet the other Jovyans by joining our Discourse forum. There is also an intro thread there where you can stop by and say Hi! 👋

Welcome to the Jupyter community! 🎉

technqvi changed the title from "Amazon Bedrock : amazon.titan-text-express-v1 : how can I setup it to use" to "Amazon Bedrock : amazon.titan-text-express-v1 : how can I setup it to use code generation on jupyter lab" on May 1, 2024

srdas commented May 1, 2024

Thanks for asking this. I tested this and it is working, so this is likely an error in the setup of your AWS account.
[screenshot]

When I checked the Bedrock page, titan-text-express-v1 does not appear to be available in the ap-southeast-1 (Singapore) region at my end. Can you check this in your own account as well, since it may be available to you in your region? Here is what I see when I switch to ap-southeast-1:
[screenshot]

Hope this helps!
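
To check availability from code rather than the console, something like the sketch below can list the Titan text models visible in a given region (this assumes boto3 is installed and the caller is allowed to call bedrock:ListFoundationModels):

```python
# Rough sketch: list the Amazon Titan text models visible in a region.
import boto3

def titan_text_models(region: str) -> list[str]:
    bedrock = boto3.client("bedrock", region_name=region)  # control-plane client
    resp = bedrock.list_foundation_models(byProvider="Amazon")
    return [m["modelId"] for m in resp["modelSummaries"] if "titan-text" in m["modelId"]]

print("us-east-1:", titan_text_models("us-east-1"))
print("ap-southeast-1:", titan_text_models("ap-southeast-1"))
```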


technqvi commented May 1, 2024

I got it.

"so this is likely an error in the setup of your AWS account."

Hey! Thank you for your help.
I forgot to specify the region, which is "us-east-1", where generative AI model access is enabled, and now I can use the chatbot.
[screenshot]

But I still cannot use it for code generation in JupyterLab.
[screenshot]

I suppose I incorrectly specified %%ai bedrock:amazon.titan-text-express-v1

ValueError: Error raised by bedrock service: An error occurred (ResourceNotFoundException) when calling the InvokeModel operation: Could not resolve the foundation model from the provided model identifier
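
If the default profile has no region configured, one possible workaround is to pin the region in the notebook before loading the magics; a sketch under that assumption (the cells and prompt below are illustrative, not from this thread):

```python
# Cell 1: pin the region before jupyter-ai/langchain create the Bedrock client.
# Assumes us-east-1 is the region where model access was granted.
import os
os.environ["AWS_DEFAULT_REGION"] = "us-east-1"

%load_ext jupyter_ai_magics
```

```python
%%ai bedrock:amazon.titan-text-express-v1
Write a Python function that reverses a string.
```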


srdas commented May 1, 2024

That is strange; glad you at least have chat (Jupyternaut) working. I tried your exact prompt and it works fine, see below:
[screenshot]

I doubt this is the problem, but can you check on the Amazon Bedrock page in your console whether you have been granted access to the model?
[screenshot]
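
One way to narrow this down is to invoke the model directly with boto3, bypassing jupyter-ai entirely; if a sketch like the one below fails with the same ResourceNotFoundException, the problem is on the AWS side rather than in jupyter-ai (the region and request body here are assumptions, not taken from the thread):

```python
# Sketch: call Titan Text Express directly to separate AWS setup issues
# from jupyter-ai issues.
import json
import boto3

runtime = boto3.client("bedrock-runtime", region_name="us-east-1")
response = runtime.invoke_model(
    modelId="amazon.titan-text-express-v1",
    body=json.dumps({"inputText": "Write a haiku about notebooks."}),
)
result = json.loads(response["body"].read())
print(result["results"][0]["outputText"])
```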


technqvi commented May 2, 2024

I keep getting the same error.
[screenshot]


srdas commented May 2, 2024

@technqvi -- Not sure where the problem lies. Could you try upgrading to the latest versions of JupyterLab and jupyter-ai? Also, are the other LLMs working, or is this the only one that is failing?
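
For reference, upgrading from inside the notebook might look like the sketch below; restart the kernel and reload JupyterLab afterwards:

```python
# Sketch: upgrade both packages from a notebook cell.
%pip install --upgrade jupyterlab jupyter-ai
```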


srdas commented May 13, 2024

@technqvi Hi -- I hope you were able to resolve this problem. I cannot reproduce it, and I hope upgrading to the latest releases (jupyter-ai 2.15 came out last Friday) will resolve it. I will close out the issue for now.

srdas closed this as completed on May 13, 2024