
Can deeprec processor be used in triton inference server? #942

Open
supercocoa opened this issue Nov 1, 2023 · 2 comments

Comments

@supercocoa

We use Triton Inference Server for online inference. Can the DeepRec processor be used in Triton Inference Server?

@candyzone
Collaborator

candyzone commented Nov 9, 2023

Not supported yet. Which feature do you want to use?
A TensorFlow/DeepRec saved_model can be deployed in Triton Server the same way Triton deploys TensorFlow models, and DeepRec features are supported that way.
If you want SessionGroup, you can use TF Serving or build an RPC server yourself.
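For context, deploying a DeepRec saved_model through Triton's stock TensorFlow backend follows the usual Triton model-repository convention. A minimal sketch, assuming the model has been exported as a standard TensorFlow SavedModel; the repository path and the model name `my_deeprec_model` are hypothetical:

```
model_repository/
└── my_deeprec_model/          # hypothetical model name
    ├── config.pbtxt
    └── 1/                     # model version directory
        └── model.savedmodel/  # the exported SavedModel directory
```

with a `config.pbtxt` along these lines:

```
name: "my_deeprec_model"
platform: "tensorflow_savedmodel"
max_batch_size: 8
# Input/output tensor specs can usually be derived automatically from the
# SavedModel signature when Triton's config auto-complete is enabled.
```

This covers only plain SavedModel serving through the TensorFlow backend; DeepRec-processor-specific features would still need a dedicated backend, as discussed below.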

@supercocoa
Author

We use Triton for all model serving tasks (recommendation / NLP / CV). We want a Triton DeepRec backend with all the DeepRec processor features for DeepRec model serving: https://deeprec.readthedocs.io/en/latest/Processor.html @candyzone
