
Update: streaming is now officially supported by the OpenAI Node library: https://github.com/openai/openai-node

Lambda-OpenAI-Stream

About

Lambda-OpenAI-Stream lets you stream OpenAI responses via an AWS Lambda Function URL. It is a simple vanilla JavaScript implementation; the only (optional) dependency is dotenv.
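The core of any vanilla-JS implementation is relaying OpenAI's server-sent-event chunks as they arrive. As a minimal sketch of the parsing step (the function name is illustrative, not the repository's actual code), each chunk is a block of `data: {...}` lines terminated by the sentinel `data: [DONE]`:

```javascript
// Extract the text deltas contained in one SSE chunk from the
// OpenAI streaming API. Each chunk holds one or more "data: {...}"
// lines; the stream ends with the sentinel "data: [DONE]".
function extractDeltas(chunk) {
  const deltas = [];
  for (const line of chunk.split("\n")) {
    const trimmed = line.trim();
    if (!trimmed.startsWith("data: ")) continue;
    const payload = trimmed.slice("data: ".length);
    if (payload === "[DONE]") break;
    const parsed = JSON.parse(payload);
    const text = parsed.choices?.[0]?.delta?.content;
    if (text) deltas.push(text);
  }
  return deltas;
}
```

In the Lambda handler, each extracted delta would be written straight to the response stream so the client sees tokens as they are generated.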

(Example GIF of a streamed response)

Deployment

Prerequisites

  • General AWS knowledge is helpful.
  • Docker must be installed locally.
  • The AWS SAM CLI must be installed locally and configured with your AWS account.

Setup

  1. Clone the repository
    git clone https://github.com/maxsagt/lambda-openai-stream.git
  2. Create a .env file in ./src containing your OpenAI API key:
    OPENAI_API_KEY=abc123
    
  3. Install dotenv
    npm init
    npm install dotenv
  4. Build and test the Lambda function locally
    sam build
    sam local invoke -e event.json
  5. Deploy to AWS. Note that your AWS user or role needs (temporary) IAM permissions for AWS CloudFormation, S3, Lambda and IAM.
    sam build --cached --parallel
    sam deploy
    # Use sam deploy --guided to control the AWS region.
  6. Done. Your Lambda Function URL is displayed in the terminal, and you can also find it in the AWS Lambda console.
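Step 4 invokes the function with the sample event in event.json. Its exact shape depends on the handler, but for a Lambda Function URL a minimal payload could look like the following (the message content is only an illustration, not necessarily the file shipped in the repository):

```json
{
  "body": "{\"messages\": [{\"role\": \"user\", \"content\": \"Say hello\"}]}"
}
```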

⚠️ Note: the Function URL is publicly exposed and has no authorization configured at this point.

Future Improvements

  • Add configurations for Amazon CloudFront and AWS WAF to introduce authentication.
  • Add an index.html to show how the frontend could work.

Feedback

Feedback and contributions are welcome!
