
Issue running on Llama.cpp #166

Closed
32ns opened this issue Apr 28, 2024 · 6 comments

Comments

32ns commented Apr 28, 2024

Describe the bug
I don't know whether it runs well in IDEA, but in GoLand and CLion it is basically unusable. With my local models (I tried quite a few), the output itself looks fine, but generation never terminates: at the end it keeps producing output indefinitely, and I have to restart the IDE after each question. The stop button added in 1.8.3 can't stop it either; after clicking stop the UI halts, but the GPU keeps computing. Calling the API directly doesn't seem to have this problem, though I haven't tested that much.
Also, when the output gets too long the scrollbar locks at one point; scrolling up or down snaps back to the same spot. You can't see the content below until generation finishes, but since it never stops, the only way to see everything is to stop the server and then start it again. My server is llama.cpp.

Member

phodal commented Apr 28, 2024

This looks like a problem with your model server; you probably need to check whether your server's implementation follows standard SSE (Server-Sent Events).
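
For context on the SSE point above: an OpenAI-compatible streaming server is expected to end the stream with a literal `data: [DONE]` event, and a client that never receives it can appear to "generate forever", matching the symptom in this issue. A minimal sketch of a compliant client-side parser (the field layout assumes OpenAI-style `choices[0].delta.content` chunks, which llama.cpp-compatible servers commonly emit):

```python
import json

def read_sse_stream(lines):
    """Collect text chunks from an OpenAI-style SSE stream.

    A well-behaved server terminates the stream with `data: [DONE]`;
    if that event never arrives, the client has no signal to stop
    rendering, even though the model may have finished.
    """
    chunks = []
    for line in lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip blank lines, comments, keep-alives
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":
            break  # the terminator the client relies on to stop
        event = json.loads(payload)
        chunks.append(event["choices"][0]["delta"].get("content", ""))
    return "".join(chunks)

# Simulated stream, as an OpenAI-compatible server would emit it:
stream = [
    'data: {"choices":[{"delta":{"content":"Hel"}}]}',
    'data: {"choices":[{"delta":{"content":"lo"}}]}',
    "data: [DONE]",
]
print(read_sse_stream(stream))  # -> Hello
```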

phodal changed the title from "Quite a few problems" to "Issue running on Llama.cpp" Apr 28, 2024
Author

32ns commented Apr 28, 2024

I've always been averse to Java, so I can't help with this myself; I'll just keep waiting.

Member

phodal commented Apr 28, 2024

No one else has run into the problem you describe… You probably need to provide more details, such as your operating system version, IDE version, and so on.

Author

32ns commented Apr 28, 2024

There should be a chat group where we could upload videos; one look and the problem would be clear, and it would make reporting issues easier too.

Member

phodal commented May 22, 2024

phodal closed this as completed May 22, 2024
Author

32ns commented May 22, 2024

It was a problem with the model: llama.cpp probably hadn't adapted its end-of-sequence (stop) token. I gave up on that model; switching to Qwen fixed it.
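
For anyone hitting the same never-ending generation: llama.cpp's HTTP server lets the client pass explicit stop strings, which can serve as a workaround when a model's EOS token isn't emitted or recognized. A minimal sketch of the request body for the `/completion` endpoint (the `prompt`, `stream`, `n_predict`, and `stop` fields are from llama.cpp's server API; the actual stop strings depend on the model's chat template, e.g. `</s>` for Llama-2-style models or `<|im_end|>` for ChatML/Qwen):

```python
import json

# Request body for llama.cpp server's /completion endpoint.
# "stop" lists strings that force generation to end even if the model
# never emits its EOS token; "n_predict" is a hard token cap so the
# stream cannot run forever regardless.
payload = {
    "prompt": "Q: What is an SSE stream?\nA:",
    "stream": True,
    "n_predict": 256,
    "stop": ["</s>", "<|im_end|>", "\nQ:"],
}
body = json.dumps(payload)
print(body)
```

One could POST `body` to the server (e.g. `http://localhost:8080/completion`) with a standard HTTP client; the exact host and port depend on how the server was started.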
