docker deployment of vllm returns 404 Not Found #271
Comments
Please post the script you used for the request.
I made the request with Postman; this is what it shows:
Does the request URL include /v1?
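A quick way to rule out a wrong path is to build the endpoint URL explicitly and compare it with what Postman is actually sending. A minimal sketch (the port 7891 is taken from a later comment in this thread; adjust the base URL to your deployment):

```python
def chat_endpoint(base_url: str) -> str:
    """Join a server base URL with the /v1/chat/completions path.

    OpenAI-compatible servers expose chat completions under the /v1
    prefix, so a request to /chat/completions alone will 404.
    """
    return base_url.rstrip("/") + "/v1/chat/completions"

print(chat_endpoint("http://127.0.0.1:7891"))
# -> http://127.0.0.1:7891/v1/chat/completions
```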
With ENGINE=vllm the request errors; with ENGINE=default it works fine.
Then vllm probably wasn't installed successfully.
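One way to confirm this hypothesis is to check from inside the running container whether the vllm package can even be found, since a failed import would leave the engine uninitialized. A minimal sketch using only the standard library:

```python
# Run inside the container (e.g. docker exec -it <name> python) to
# check whether the vllm package is present on sys.path at all.
import importlib.util

def vllm_available() -> bool:
    """Return True if the vllm package can be located for import."""
    return importlib.util.find_spec("vllm") is not None

print("vllm importable:", vllm_available())
```

If this prints False, the image was likely built from a Dockerfile that never installed vllm, matching the suggestion below.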
Which Dockerfile did you use when you ran docker build? Switch to the vllm one.
This is the one I switched to.
Same problem here. Latest code, deployed vllm with docker-compose. Only the embedding model shows any GPU usage, the logs report no errors, and requests return 404. LOG
api-for-open-llm/api/models.py Line 100 in e46e480
I added a line here to print the exception log. I pulled the code just this afternoon and rebuilt the image; there were no errors at any point.
Possibly related issue: vllm-project/vllm#3528
Deployed the qwen model with vllm via docker. Startup succeeds, but calling the API returns "POST /v1/chat/completions HTTP/1.1" 404 Not Found. I can't figure out why; any help would be appreciated.
http://127.0.0.1:7891/docs shows "No operations defined in spec!"
2024-05-09 07:00:03 INFO: Started server process [1]
2024-05-09 07:00:03 INFO: Waiting for application startup.
2024-05-09 07:00:03 INFO: Application startup complete.
2024-05-09 07:00:03 INFO: Uvicorn running on http://0.0.0.0:8000 (Press CTRL+C to quit)
2024-05-09 07:00:04 INFO: 172.16.1.1:57384 - "POST /v1/chat/completions HTTP/1.1" 404 Not Found
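The "No operations defined in spec!" message on /docs means the FastAPI app started with no routes registered at all, which would be consistent with the engine failing to initialize: if route registration is conditional on the backend loading, every request 404s even though Uvicorn reports a clean startup (as in the log above). A simplified illustrative sketch of that pattern, not the project's actual code:

```python
# Simplified sketch: routes are only registered when the engine
# loads, so a silent engine failure yields an app with zero routes.
routes = {}

def register_routes(engine_loaded: bool) -> None:
    """Add the chat route only if the backend engine initialized."""
    if engine_loaded:
        routes["/v1/chat/completions"] = "chat_handler"

def dispatch(path: str) -> int:
    """Return an HTTP-style status code for the given path."""
    return 200 if path in routes else 404

register_routes(engine_loaded=False)  # e.g. vllm failed to import
print(dispatch("/v1/chat/completions"))  # -> 404
```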