服务 (Service) #173

At the moment this seems to run only locally. Is it currently possible to deploy it as an OpenAI-style service that can be queried with curl or similar commands? Thanks for your reply!
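For reference, this is roughly the kind of call the issue is asking for, in the OpenAI chat-completions format. This is a minimal sketch only: the endpoint URL, port, and model name are placeholders, not anything mnn-llm currently exposes.

```python
# Hypothetical sketch of an OpenAI-compatible chat-completions request.
# The base URL and model name below are assumptions for illustration.
import json
import urllib.request

def chat(prompt: str, base_url: str = "http://127.0.0.1:8080") -> str:
    payload = {
        "model": "qwen-1.8b",  # placeholder model name
        "messages": [{"role": "user", "content": prompt}],
    }
    req = urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # Standard OpenAI-style response shape: first choice's message content.
    return body["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(chat("你好"))
```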
Do you mean deploying it on the server side and then invoking it via requests?
@wangzhaode
Yes, and hopefully it can handle multiple concurrent requests.
Is this on the roadmap? llama.cpp provides a server that supports dynamic batching; does mnn-llm plan to offer something similar?
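To make the "multiple concurrent requests" scenario concrete, here is a small hypothetical sketch of several clients querying one server at the same time; a server with dynamic batching, as mentioned above for llama.cpp, would group such in-flight requests into shared batches instead of serving them strictly one by one. It reuses the `chat()` helper from the sketch above and assumes the same placeholder endpoint.

```python
# Hypothetical sketch: issue several requests concurrently against one server.
from concurrent.futures import ThreadPoolExecutor

prompts = ["介绍一下MNN", "What is dynamic batching?", "Write a haiku about llamas"]

with ThreadPoolExecutor(max_workers=len(prompts)) as pool:
    # chat() is the helper from the previous sketch; the endpoint is assumed.
    results = list(pool.map(chat, prompts))

for prompt, reply in zip(prompts, results):
    print(prompt, "->", reply[:60])
```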
@wangzhaode Same here! Looking forward to a server feature. Thank you for this excellent work.
Marking as stale. No activity in 30 days. |