When using ollama with the model llama3:70b, the code cannot run #1220
Comments
I'm also having this problem.
Same here, with the llama3 8b model. I do get the backtick, and I also get random responses that don't solve the issue. Sometimes it responds to previous questions even though I ask for something else.
Same issue running llama3-8b on macOS 14.4 using LM Studio.
Same here; ollama/llama3-8b on macOS 13.5.2
same -- goes very mad |
Update to version 0.2.5
Seems to be working now after the reinstall. Thanks!
Tried it, and the install mostly gave errors:
Will it work? lol Also still getting this:
(although I got this before and it still works)
So when I try with various models I get odd things happening, where it's almost telling me its thoughts or just coming back with odd stuff:
The latest I have now is that it always responds with "a plan".
I did an update to 0.2.5, but I think there have been 2 updates to that, so how do I know I've got the right one?
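One generic way to check which version pip actually installed is Python's standard-library `importlib.metadata` (the package name `open-interpreter` is an assumption here; the sketch below demonstrates the lookup with `pip`, which is always present):

```python
from importlib.metadata import version, PackageNotFoundError

def installed_version(package: str):
    """Return the installed version string of a package, or None if it is absent."""
    try:
        return version(package)
    except PackageNotFoundError:
        return None

# Demonstration with a package that is always installed:
print(installed_version("pip"))
# For the actual check, you would query "open-interpreter" instead.
```

Comparing the returned string against 0.2.5 tells you whether the upgrade took effect.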
Same issue running llama3 on Windows.
Describe the bug
When using ollama with the model llama3:70b, the generated Python code comes back with a stray "`", which prevents the code from being executed.
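A minimal sketch of a client-side workaround for this symptom: stripping a stray backtick or a leftover markdown fence from the model's output before executing it. The function name and structure are illustrative assumptions, not Open Interpreter internals:

```python
def clean_code(raw: str) -> str:
    """Strip a markdown fence or stray backticks from model-generated code.

    Hypothetical helper: llama3 models sometimes wrap code in ``` fences or
    leave a lone "`" at the edges, which breaks direct execution.
    """
    code = raw.strip()
    if code.startswith("```"):
        lines = code.splitlines()
        # Drop the opening fence (it may carry a language tag like ```python) ...
        lines = lines[1:]
        # ... and the closing fence, if the model emitted one.
        if lines and lines[-1].strip() == "```":
            lines = lines[:-1]
        code = "\n".join(lines)
    # Remove any remaining stray single backticks at either end.
    return code.strip("`").strip()
```

For example, `clean_code("`print('hi')`")` and `clean_code("```python\nprint('hi')\n```")` both yield plain `print('hi')`.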
Reproduce
Expected behavior
It should give me some output.
Screenshots
Open Interpreter version
0.2.0
Python version
3.12.2
Operating System name and version
Windows 11
Additional context
No response