I'm specifically inquiring about the predictor module.
It's worth noting that a single model can back more than one endpoint. For instance, the ColBERT model exposes two endpoints: one for embedding and another for reranking.
Question:
Given that a model can back multiple endpoints, is there a way to expose more than one endpoint within a single inference service for a single model, without hosting the model multiple times for the different endpoints?
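To illustrate what the question is asking for, here is a minimal, framework-agnostic sketch (with a hypothetical stand-in model, not the real ColBERT weights or any particular serving library's API): the model is loaded once and shared by two endpoint handlers, one for embedding and one for reranking, so no second copy of the model is hosted.

```python
class ColBERTLikeModel:
    """Hypothetical stand-in for a real model with shared weights."""

    def encode(self, text):
        # Toy "embedding": a small integer vector derived from characters.
        return [ord(c) % 7 for c in text]


class InferenceService:
    """One service instance exposing two logical endpoints over one model."""

    def __init__(self, model):
        # The model is loaded once and shared by both endpoints.
        self.model = model

    def embed(self, text):
        # "/embed" endpoint: return the embedding directly.
        return self.model.encode(text)

    def rerank(self, query, docs):
        # "/rerank" endpoint: score documents against the query
        # using the very same in-memory model.
        q = self.model.encode(query)

        def score(doc):
            d = self.model.encode(doc)
            n = min(len(q), len(d))
            return sum(q[i] * d[i] for i in range(n))

        return sorted(docs, key=score, reverse=True)


service = InferenceService(ColBERTLikeModel())
embedding = service.embed("hi")
ranking = service.rerank("cat", ["dog", "cat"])
```

In a real deployment the two methods would be routed to distinct HTTP paths (e.g. `/embed` and `/rerank`) by whatever server layer the predictor module provides; the point of the sketch is only that both handlers share one loaded model instance.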