
[Local deployment]: The chatglm2-6b local model cannot be used, confirmed it is in the models/ folder #1114

Open · 2 tasks done
MoonShadow1976 opened this issue May 6, 2024 · 0 comments
Labels
localhost deployment, question (Further information is requested)

Comments

@MoonShadow1976

Is there an existing report or answer for this?

  • I confirm there is no existing issue or discussion, and I have read the FAQ.

Is this a proxy-configuration question?

  • I confirm this is not a proxy-configuration question.

Error description

An AI assistant suggested that the value does not match the MODEL_METADATA dictionary, but there is no obvious character-level anomaly in the name.
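
Since Python dictionary lookups are case-sensitive, a name that looks identical can still miss the key if its casing or punctuation differs. Below is a minimal diagnostic sketch; the import path `modules.presets` is an assumption about where MODEL_METADATA is defined and should be adjusted to the actual location:

```python
# Hedged diagnostic: compare the configured model name against the keys
# actually present in MODEL_METADATA.  "modules.presets" is an assumed
# import path; point it at wherever MODEL_METADATA really lives.
from modules.presets import MODEL_METADATA

configured = "chatglm2-6b"  # the name reported in the log below

print("exact key present:", configured in MODEL_METADATA)
print("case-insensitive matches:",
      [k for k in MODEL_METADATA if k.lower() == configured.lower()])
```

If the second line prints a differently cased key (for example a hypothetical "ChatGLM2-6B"), using that exact spelling in the configuration should avoid the KeyError.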

Steps to reproduce

Standard deployment: create the virtual environment, clone the repository with git, and configure it to use the chatglm2-6b model.
(screenshot attached to the original issue)

Error log

Opening ChuanhuChatGPT...
venv "F:\APPLICATION\amaconda\envs\ChuanhuChat\Python.exe"
2024-05-06 23:31:39,573 [INFO] [_client.py:1026] HTTP Request: GET https://api.gradio.app/gradio-messaging/en "HTTP/1.1 200 OK"
2024-05-06 23:31:40,834 [INFO] [config.py:327] Default model set to: chatglm2-6b
2024-05-06 23:31:42,142 [INFO] [utils.py:148] Note: NumExpr detected 12 cores but "NUMEXPR_MAX_THREADS" not set, so enforcing safe limit of 8.
2024-05-06 23:31:42,143 [INFO] [utils.py:161] NumExpr defaulting to 8 threads.
Traceback (most recent call last):
  File "F:\CodeAPP\MapGBT\ChuanhuChatGPT\ChuanhuChatbot.py", line 204, in <module>
    value=i18n(MODEL_METADATA[MODELS[DEFAULT_MODEL]]["description"]),
KeyError: 'chatglm2-6b'
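
For context, here is a simplified, hypothetical reconstruction of the failing lookup at ChuanhuChatbot.py line 204. The names MODELS, DEFAULT_MODEL and MODEL_METADATA mirror the traceback, but the entries are illustrative only, not the project's real data:

```python
# Hypothetical, self-contained reconstruction of the failure mode.
MODEL_METADATA = {
    "ChatGLM3-6B": {"description": "illustrative entry"},
}
MODELS = ["ChatGLM3-6B", "chatglm2-6b"]  # "chatglm2-6b" comes from the user config
DEFAULT_MODEL = 1                        # index of the configured default model

model_name = MODELS[DEFAULT_MODEL]       # -> "chatglm2-6b"

# Indexing MODEL_METADATA directly, as the traceback shows, raises
# KeyError: 'chatglm2-6b' because that exact string is not a key.
try:
    description = MODEL_METADATA[model_name]["description"]
except KeyError:
    # A defensive variant would fall back instead of crashing at startup:
    description = MODEL_METADATA.get(model_name, {}).get("description", "")
    print(f"{model_name!r} not in MODEL_METADATA; known keys: {sorted(MODEL_METADATA)}")
```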

Runtime environment

- OS: Windows 11 23H2
- Browser: Edge
- Gradio version: 4.26.0
- Python version: 3.10.14

Additional notes

CUDA version 3.11
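
As a side note, 3.11 does not correspond to any CUDA release, so it may refer to another component's version. The CUDA build actually visible to the environment can be checked with the sketch below, assuming PyTorch is installed, which running chatglm2-6b locally requires:

```python
# Quick check of the CUDA build seen by the Python environment.
import torch

print("torch:", torch.__version__)
print("CUDA build:", torch.version.cuda)            # None for a CPU-only build
print("CUDA available:", torch.cuda.is_available())
```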

MoonShadow1976 added the localhost deployment and question (Further information is requested) labels on May 6, 2024