
[Serving] Add a table of models and corresponding supported parameters #51

Open
KepingYan opened this issue Jan 11, 2024 · 0 comments

We provide many model configuration files in inference/models, but users cannot easily tell which parameters (such as ipex, deepspeed, and vllm) can be enabled for each model. We should add a table that makes this clear.
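As a rough sketch of what generating such a table could look like, the snippet below renders a Markdown support matrix from a dict of per-model flags. The model names and flag values here are placeholders only; in practice they would be parsed from the YAML files under inference/models rather than hard-coded.

```python
# Hypothetical support data -- illustrative placeholders, not the real
# contents of inference/models. In practice this dict would be built by
# reading each model's YAML configuration file.
SUPPORT = {
    "model-a": {"ipex": True, "deepspeed": False, "vllm": True},
    "model-b": {"ipex": True, "deepspeed": True, "vllm": True},
}

def render_table(support):
    """Render a Markdown table of models vs. supported parameters."""
    params = ["ipex", "deepspeed", "vllm"]
    lines = [
        "| Model | " + " | ".join(params) + " |",
        "|" + "---|" * (len(params) + 1),
    ]
    for model, flags in support.items():
        cells = ["yes" if flags.get(p) else "no" for p in params]
        lines.append("| " + model + " | " + " | ".join(cells) + " |")
    return "\n".join(lines)

print(render_table(SUPPORT))
```

The resulting Markdown could then be pasted into the README or docs, so the table stays easy to regenerate whenever a model config changes.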

KepingYan changed the title from "[Serving] Add a tale of models and corresponding supported parameters matching" to "[Serving] Add a table of models and corresponding supported parameters matching" on Jan 11, 2024
KepingYan changed the title from "[Serving] Add a table of models and corresponding supported parameters matching" to "[Serving] Add a table of models and corresponding supported parameters" on Jan 11, 2024
zhangjian94cn pushed a commit to zhangjian94cn/llm-on-ray that referenced this issue Feb 4, 2024
* slim dockerfile

* remove credentials

* rename

* add postfix

* update