QA with LLM and RAG (Retrieval Augmented Generation) powered by Amazon Bedrock and Kendra

This project is a Question Answering application built with Large Language Models (LLMs) and Amazon Kendra. An application using the RAG (Retrieval Augmented Generation) approach retrieves the information most relevant to the user's request from the enterprise knowledge base or content, bundles it as context along with the user's request into a prompt, and sends that to the LLM to get a generative AI response.
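The prompt-assembly step described above can be sketched as follows. The function name and prompt template are illustrative assumptions, not code from this repository:

```python
def build_rag_prompt(question: str, passages: list[str]) -> str:
    """Bundle retrieved passages as context alongside the user's question."""
    context = "\n\n".join(passages)
    return (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}\n"
        "Answer:"
    )
```

The prompt, not the model, carries the enterprise knowledge here, which is why retrieval quality directly bounds answer quality.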

LLMs have limits on the maximum length of the input prompt, so choosing the right passages among the thousands or millions of documents in the enterprise has a direct impact on the LLM's accuracy.
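Because of that input limit, a retriever typically keeps only the top-ranked passages that fit a prompt budget. A minimal greedy sketch, where the character budget and relevance scores are assumptions rather than this project's actual logic:

```python
def select_passages(scored_passages: list[tuple[float, str]], max_chars: int) -> list[str]:
    """Greedily keep the highest-scoring passages that fit the prompt budget."""
    selected: list[str] = []
    used = 0
    # Visit passages from most to least relevant; skip any that would overflow.
    for score, text in sorted(scored_passages, key=lambda p: p[0], reverse=True):
        if used + len(text) <= max_chars:
            selected.append(text)
            used += len(text)
    return selected
```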

In this project, Amazon Kendra is used as the knowledge base.
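Retrieval against a Kendra index can be sketched with boto3's Kendra `retrieve` API. This is a hedged example, not this repository's code: the index ID is a placeholder you get from the deployed stack, and the parsing assumes the documented `ResultItems`/`Content` response shape:

```python
def passages_from_response(response: dict, top_k: int = 3) -> list[str]:
    """Pull passage text out of a Kendra Retrieve API response."""
    return [item["Content"] for item in response.get("ResultItems", [])[:top_k]]

def retrieve_passages(index_id: str, question: str, top_k: int = 3) -> list[str]:
    import boto3  # deferred import: calling this requires AWS credentials and a deployed index
    client = boto3.client("kendra")
    response = client.retrieve(IndexId=index_id, QueryText=question, PageSize=top_k)
    return passages_from_response(response, top_k)
```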

The overall architecture looks like this:

(Architecture diagram: rag_with_bedrock_kendra_arch)

Prerequisites

Before using a foundation model in Bedrock, your account must first be granted access to it.

Follow the steps here to add models: Amazon Bedrock - Add model access

Some models require additional information, and it can take some time before access is granted. Once a model shows "Access granted" on the Model Access page, you should be able to call the invoke_model() function without an error.
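Once access is granted, an invoke_model() call looks roughly like this. The model ID and the Anthropic-style request body below are assumptions; adjust them for whichever model you enabled:

```python
import json

def build_request(prompt: str, max_tokens: int = 512) -> str:
    """Anthropic-style request body; the JSON format differs per model provider."""
    return json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    })

def invoke(prompt: str, model_id: str = "anthropic.claude-3-sonnet-20240229-v1:0") -> str:
    import boto3  # deferred import: requires AWS credentials and granted model access
    client = boto3.client("bedrock-runtime")
    response = client.invoke_model(modelId=model_id, body=build_request(prompt))
    payload = json.loads(response["body"].read())
    return payload["content"][0]["text"]
```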

Overall Workflow

  1. Deploy the CDK stacks (for more information, see here).
    • An Amazon Kendra index for the knowledge base.
  2. Open SageMaker Studio and then open a new terminal.
  3. Run the following commands on the terminal to clone the code repository for this project:
    git clone --depth=1 https://github.com/aws-samples/qa-app-with-rag-using-amazon-bedrock-and-kendra.git
    
  4. Run the Streamlit application. (For more information, see here.)
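The last step might look like this in the SageMaker Studio terminal. The directory layout, requirements file, and app filename are assumptions; check the repository for the actual paths:

```shell
cd qa-app-with-rag-using-amazon-bedrock-and-kendra
pip install -r requirements.txt   # install the app's dependencies
streamlit run app.py              # launch the QA web UI
```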

References

Security

See CONTRIBUTING for more information.

License

This library is licensed under the MIT-0 License. See the LICENSE file.