Issues: ollama/ollama
#4771 · Ignoring env, being weird with env · [bug] · opened Jun 1, 2024 by RealMrCactus
#4770 · server.log grows indefinitely on Windows · [bug] · opened Jun 1, 2024 by dhiltgen
#4769 · Infinitely generating irrelevant response when running phi3-mini in Linux terminal · [bug] · opened Jun 1, 2024 by MomenAbdelwadoud
#4767 · Model response corruption and leaking data between sessions · [bug] · opened Jun 1, 2024 by MarkWard0110
#4764 · ollama stop [id of running model] · [feature request] · opened Jun 1, 2024 by mrdev023
#4763 · I created Ollama - Open WebUI Script - Give it a try! · [feature request] · opened Jun 1, 2024 by Special-Niewbie
#4758 · Add this web app to the list of apps in the README · [feature request] · opened May 31, 2024 by greenido
#4755 · (Windows) Ollama model download does not resume when Ollama is reopened · [feature request] · opened May 31, 2024 by waldolin
#4753 · FROM is not recognized · [bug] · opened May 31, 2024 by EugeoSynthesisThirtyTwo
#4752 · Multi-GPU and batch management · [feature request] · opened May 31, 2024 by LaetLanf
#4750 · Garbage output running llama3 GGUF model · [bug] · opened May 31, 2024 by DiptenduIDEAS
#4749 · OLLAMA_MODELS not applied on initial start or on restart after upgrade on macOS · [feature request] · opened May 31, 2024 by vernonstinebaker
#4745 · CMake Error at CMakeLists.txt:2 (project): Generator Ninja does not support platform specification, but platform · [bug] · opened May 31, 2024 by chaoqunxie
#4739 · Sensitivity to slow or unstable internet · [bug] · opened May 31, 2024 by logiota
#4732 · Unable to Change Ollama Models Directory on Linux (Rocky9) · [bug] · opened May 30, 2024 by pykeras
#4730 · llama3:8b-instruct performs much worse than llama3-8b-8192 on groq · [bug] · opened May 30, 2024 by mitar
#4726 · Have an NVIDIA GPU, but cannot use it · [bug, nvidia] · opened May 30, 2024 by pengyuxiang1
#4722 · Slower performance on Arm64 with Phi3 and Lexi-Llama on 1.39 · [bug, performance] · opened May 30, 2024 by khanumballz
#4720 · Ollama unloads model when embedding a large PDF file · [bug] · opened May 30, 2024 by travisgu
#4713 · Codestral doesn't output correct response · [bug] · opened May 30, 2024 by jasonhotsauce
#4711 · Adding function calling support for Agents management · [feature request] · opened May 29, 2024 by flefevre