A proof-of-concept on how to install and use TorchServe in various modes
Updated Mar 15, 2024 - Python
A message queue based server architecture to asynchronously handle resource-intensive tasks (e.g., ML inference)
TorchServe images with a specific Python version, working out of the box.
Serving a PyTorch model using Flask and Docker
Universal Semantic Annotator (LREC 2022)
Simple HTTP serving for PyTorch 🚀
Chatting-Day's Dialogue State Tracking (DST)
Segmenting people in photos on iOS devices [PyTorch; U-Net]
A collection of model deployment libraries and techniques.
ClearML - Model-Serving Orchestration and Repository Solution
In this repository, I share useful notes and references on deploying deep learning models in production.
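Several of the entries above (Flask serving, "Simple HTTP serving for PyTorch", the message-queue server) share one core pattern: wrap a loaded model behind an HTTP endpoint that accepts JSON inputs and returns JSON outputs. A minimal stdlib-only sketch of that pattern is below; `fake_model` is a hypothetical stub standing in for a real loaded `torch.nn.Module`, so the example runs without PyTorch installed.

```python
# Minimal sketch of HTTP model serving. `fake_model` is a stub standing in
# for a real PyTorch model (swap in model(torch.tensor(...)) in practice).
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer


def fake_model(inputs):
    # Stand-in for a forward pass; returns one "score" per input row.
    return [sum(x) for x in inputs]


class InferenceHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the JSON request body: {"inputs": [[...], [...]]}
        length = int(self.headers["Content-Length"])
        payload = json.loads(self.rfile.read(length))
        result = fake_model(payload["inputs"])
        body = json.dumps({"outputs": result}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        # Silence per-request logging for the demo.
        pass


def serve(port=0):
    # port=0 lets the OS pick a free port; read it from server_address.
    server = HTTPServer(("127.0.0.1", port), InferenceHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server


if __name__ == "__main__":
    server = serve()
    port = server.server_address[1]
    req = urllib.request.Request(
        f"http://127.0.0.1:{port}/predict",
        data=json.dumps({"inputs": [[1, 2], [3, 4]]}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        print(json.loads(resp.read()))  # {'outputs': [3, 7]}
    server.shutdown()
```

Production tools like TorchServe add what this sketch omits: batching, model versioning, worker management, and metrics; the Flask-based repos above sit between the two extremes.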