đŸ¤— BERT-DRCD-QuestionAnswering

This project uses FastAPI as the backend and Streamlit as the frontend. The Streamlit UI sends a POST request to the backend and renders the response. The QA model is served through a FastAPI REST service and containerized with Docker; the Streamlit UI runs in its own Docker container.

Both containers are started together with Docker Compose.
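The round trip between the two containers can be sketched as follows. This is a hypothetical client-side view of what the Streamlit UI does; the payload field names ("context", "question") and the response shape are assumptions, not taken from this repository's schema, and only the /predict/ URL comes from this README.

```python
# Hypothetical sketch of the UI-to-backend round trip (stdlib only).
# Field names "context"/"question" and the "answer" key are assumptions.
import json
from urllib import request

BACKEND_PRED_URL = "http://127.0.0.1:8000/predict/"  # value used when running locally

def build_payload(context: str, question: str) -> bytes:
    """Serialize the QA inputs as the JSON body of the POST request."""
    return json.dumps({"context": context, "question": question}).encode("utf-8")

def parse_answer(body: bytes) -> str:
    """Pull the predicted answer span out of the JSON response."""
    return json.loads(body)["answer"]

if __name__ == "__main__":
    req = request.Request(
        BACKEND_PRED_URL,
        data=build_payload("...context paragraph...", "...question..."),
        headers={"Content-Type": "application/json"},
    )
    # Requires the FastAPI container (or local uvicorn server) to be running.
    with request.urlopen(req) as resp:
        print(parse_answer(resp.read()))
```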

Streamlit UI

Navigate to http://127.0.0.1:8501/ after spinning up the application on your local machine or Docker host.

(screenshot: Streamlit UI)

Interactive API docs

Navigate to http://127.0.0.1:8000/docs after spinning up the application on your local machine or Docker host.

You will see the automatic interactive API documentation (provided by Swagger UI):

(screenshot: interactive API docs)

Quick start - Using Docker Compose

Running Docker applications

Spin up our containers in detached mode.

docker-compose up -d
docker ps             # To check the running containers
docker-compose down   # To shutdown the running containers
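A Compose file for this setup might look like the sketch below. The service names and build contexts are assumptions (the README mentions "fastapi" and "streamlit" directories); the published ports match the URLs used above.

```yaml
# Hypothetical docker-compose.yml; service names and build contexts are
# assumptions, the ports match the URLs used in this README.
version: "3"
services:
  fastapi:
    build: ./fastapi
    ports:
      - "8000:8000"
  streamlit:
    build: ./streamlit
    ports:
      - "8501:8501"
    depends_on:
      - fastapi
```

`depends_on` only orders container startup; the UI should still handle the backend briefly being unreachable while the model loads.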

Quick start - Local Machine

Install the dependencies

As a best practice, create a virtual environment first.

  • Go to the project directory
  • Install the requirements for both FastAPI and Streamlit
pip install -r requirements.txt

Start the FastAPI server

  • Go to the "fastapi" directory
  • Run the following command
uvicorn app.main:app --reload

Start the Streamlit server

  • Go to the "streamlit" directory
  • Run the following command
  • If running locally, change the backend_pred_url in app.py to "http://127.0.0.1:8000/predict/"
streamlit run app.py
