Llamaindex S3 Index Storage

Using LlamaIndex to query a vector index stored in S3. The new LLM Stack. Yay!
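For context, here is a minimal sketch of how such an index might be built and persisted to S3 in the first place. It assumes the pre-0.10 llama_index API with fsspec/s3fs support; the ./docs folder, credential placeholders, and bucket path are illustrations, not part of this repo.

import s3fs
from llama_index import SimpleDirectoryReader, VectorStoreIndex

# Build a vector index from local documents (placeholder folder)
documents = SimpleDirectoryReader("./docs").load_data()
index = VectorStoreIndex.from_documents(documents)

# Persist the index files to an S3 bucket through an fsspec filesystem
s3 = s3fs.S3FileSystem(key="<AWS_KEY>", secret="<AWS_SECRET>")
index.storage_context.persist(persist_dir="llamaindex-demo/storage", fs=s3)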

How to install a virtual environment using venv

pip install virtualenv

To use venv in your project, create a new project folder, cd into it in your terminal, and run the following commands:

mkdir llamaindex-s3-index-storage
cd llamaindex-s3-index-storage

python3 -m venv env

How to activate the virtual environment

You can then activate your new Python virtual environment by running the command below.

source ./env/bin/activate

Install python packages

Install the packages the application requires

pip install -r requirements.txt

Environment variables

You will need a few secrets for this to work. Rename .env.dev to .env so they load into the application:

OPENAI_API_KEY=
AWS_KEY=
AWS_SECRET=
BUCKET=
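As a minimal sketch of how these variables might be read at startup (load_dotenv and the s3fs client are assumptions about how app.py wires things up; the variable names match .env.dev):

import os
import s3fs
from dotenv import load_dotenv

load_dotenv()  # reads the .env file in the project root

# OpenAI key and S3 credentials pulled from the environment
openai_api_key = os.environ["OPENAI_API_KEY"]
s3 = s3fs.S3FileSystem(
    key=os.environ["AWS_KEY"],
    secret=os.environ["AWS_SECRET"],
)
bucket = os.environ["BUCKET"]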

S3 bucket

You will need a public S3 bucket to use as storage for the index/vectors. Note: I used a public bucket; I have not tested this with a private one.

llamaindex-demo/storage
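A minimal sketch of loading and querying the index from that persist directory, assuming the pre-0.10 llama_index API and the s3fs filesystem from the snippet above:

from llama_index import StorageContext, load_index_from_storage

# Rebuild the storage context from the S3 persist directory
storage_context = StorageContext.from_defaults(
    persist_dir="llamaindex-demo/storage",  # <bucket>/<prefix>
    fs=s3,  # s3fs.S3FileSystem built from AWS_KEY / AWS_SECRET
)

# Load the vector index and run a query against it
index = load_index_from_storage(storage_context)
query_engine = index.as_query_engine()
print(query_engine.query("What are these documents about?"))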

Running the app

This command will run the Streamlit app and open it in a new browser tab.

streamlit run app.py
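As a rough sketch of what app.py might look like when the pieces above are put together (the widget labels, caching, and query wiring are assumptions, not the actual app):

import os
import s3fs
import streamlit as st
from dotenv import load_dotenv
from llama_index import StorageContext, load_index_from_storage

load_dotenv()

@st.cache_resource  # load the index from S3 only once per session
def get_query_engine():
    s3 = s3fs.S3FileSystem(key=os.environ["AWS_KEY"], secret=os.environ["AWS_SECRET"])
    storage_context = StorageContext.from_defaults(
        persist_dir=f"{os.environ['BUCKET']}/storage", fs=s3
    )
    return load_index_from_storage(storage_context).as_query_engine()

st.title("LlamaIndex S3 Index Storage")
question = st.text_input("Ask a question about the indexed documents")
if question:
    st.write(str(get_query_engine().query(question)))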

Running the app via Docker

Build the image

docker build -t mystreamlitapp .

Run the newly built image

docker run -p 8501:8501 mystreamlitapp
