Issues: ollama/ollama

- #4764: ollama stop [id of running model] (feature request), opened Jun 1, 2024 by mrdev023
- #4758: Add this web app to the list of apps in the README (feature request), opened May 31, 2024 by greenido
- #4753: FROM is not recognized (bug), opened May 31, 2024 by EugeoSynthesisThirtyTwo
- #4752: Multi-GPU and batch management (feature request), opened May 31, 2024 by LaetLanf
- #4750: Garbage output running llama3 GGUF model (bug), opened May 31, 2024 by DiptenduIDEAS
- #4748: Custom-llama issue (bug), opened May 31, 2024 by Ascariota
- #4739: sensitivity to slow or unstable internet (bug), opened May 31, 2024 by logiota
- #4732: Unable to Change Ollama Models Directory on Linux (Rocky9) (bug), opened May 30, 2024 by pykeras
- #4730: llama3:8b-instruct performs much worse than llama3-8b-8192 on groq (bug), opened May 30, 2024 by mitar
- #4729: dolphin-2.9.2-mixtral-8x22b (model request), opened May 30, 2024 by psyv282j9d
- #4727: unlike the v0.1.38, GPU NVIDIA not working with version v0.1.39 in windows ollama.exe (bug, nvidia, windows), opened May 30, 2024 by zouzeTG
- #4726: have an NVIDIA GPU, but can not use. (bug, nvidia), opened May 30, 2024 by pengyuxiang1
- #4724: empty response (bug), opened May 30, 2024 by themw123
- #4720: Ollama unload model when embedding a large pdf file (bug), opened May 30, 2024 by travisgu
- #4713: Codestral doesn't output correct response (bug), opened May 30, 2024 by jasonhotsauce
- #4711: Adding function calling support for Agents management (feature request), opened May 29, 2024 by flefevre
- #4710: s390x build ollama : running gcc failed (bug), opened May 29, 2024 by woale
- #4705: arm64 llama runner takes a long time to start compared to amd64 arch (bug), opened May 29, 2024 by glenamac