This repository has been archived by the owner on May 28, 2024. It is now read-only.

Serve a new model without restarting RayLLM #130

Open
k6l3 opened this issue Feb 2, 2024 · 1 comment

Comments

@k6l3

k6l3 commented Feb 2, 2024

Is it possible to start serving a new model when an event happens, without restarting RayLLM?
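There is no answer from the maintainers in this thread. One possible direction, not an official RayLLM feature: Ray Serve, which RayLLM is built on, performs an in-place rolling update when `serve.run()` is called again with the same application name, so a new model can be added to the running application without tearing the cluster down. The sketch below assumes that path; the `ModelServer` class, the `llm-router` application name, and the model ids are hypothetical placeholders, not RayLLM APIs.

```python
# Minimal sketch: redeploy a Ray Serve application with an extra model when
# an external event fires. Calling serve.run() again with the same `name`
# triggers a rolling update of the existing deployment, not a cold restart.
from ray import serve


@serve.deployment
class ModelServer:
    def __init__(self, model_ids):
        # Placeholder: load each model here (actual loading logic omitted).
        self.model_ids = list(model_ids)

    async def __call__(self, request):
        # Toy handler that just reports which models are currently served.
        return {"models": self.model_ids}


def redeploy(model_ids):
    # Rebuild the application graph with the updated model list and
    # hand it to Serve; the "llm-router" app is upgraded in place.
    app = ModelServer.bind(model_ids)
    serve.run(app, name="llm-router", route_prefix="/")


if __name__ == "__main__":
    redeploy(["model-a"])                 # initial deployment
    # ... later, when the event happens ...
    redeploy(["model-a", "model-b"])      # add a second model, no restart
```

The same effect is available from the command line: re-running `serve deploy` with an updated Serve config file asks the running cluster to reconcile to the new config rather than restarting it. Whether RayLLM's router picks up new model entries this way is not confirmed in this issue.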

@samarth-contextual

Bumping this.
