Ollama create fails when using a utf16 Modelfile #4503
Comments
Be careful about the spaces. First, try copying the Modelfile contents from: https://github.com/ollama/ollama/blob/main/docs/api.md#request-13
@dehlong I just created a model using that modelfile and it worked fine. Was there anything else you had added into it?
The Modelfile looks like:
[screenshot of the Modelfile]
Nope, it was just this.
I've been having the same issue on Windows for a few days. At this point I'm sure it's not an issue with the Modelfile itself: I've uninstalled and reinstalled Ollama, and using the OpenWebUI model builder does work, but I need to use ollama create for other reasons. @dehlong I'll let you know if I stumble into an answer, but at this point I feel pretty stuck. For clarity, this has been working for me previously; I've used ollama create successfully dozens of times. It broke for me with the last update.
I'm wondering if this is just a problem with MS-DOS-style files adding a carriage return + line feed to the end of each line. Can either of you attach the file to the issue? I think you should be able to drag and drop it into a comment.
Does this work? Edit: to be clear, I'm having the same problem with other Modelfiles that show up as type '3 file' and '6 file'.
@duck1y Perfect. I was able to reproduce the problem using that file. Will try to sort out a fix now.
The problem turns out to be that the file is a UTF-16 file and we're trying to parse it as UTF-8. The temporary workaround is to convert the file in PowerShell using the command:
I have a fix I'm working on, which hopefully we can get into |
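To make the diagnosis above concrete, here is a minimal sketch of why a UTF-16 Modelfile confuses a UTF-8 parser. This is not Ollama's actual code (which is Go); it's just an illustration of the byte-level mismatch, using Python for brevity:

```python
# Sketch: why a UTF-16 Modelfile breaks a parser that expects UTF-8.
# Windows editors such as Notepad can save files as "Unicode",
# which means UTF-16 (little-endian) with a byte-order mark (BOM).

utf16_bytes = "FROM llama3\n".encode("utf-16")  # BOM + 2 bytes per character

# Decoded as UTF-8 (roughly what the parser sees), the first token is no
# longer the literal string "FROM": the BOM becomes replacement characters
# and every other byte is a NUL, so command matching fails.
as_utf8 = utf16_bytes.decode("utf-8", errors="replace")
print(as_utf8.startswith("FROM"))  # False

# Re-encoding the file as UTF-8 restores a parseable Modelfile.
fixed = utf16_bytes.decode("utf-16").encode("utf-8")
print(fixed.decode("utf-8").startswith("FROM"))  # True
```

This is why the error message complains that the command isn't one of the known keywords: the parser never sees a clean "FROM" at the start of the line.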
@pdevine the workaround worked for me - thank you so much!! I appreciate you!!! Is it clear that this is likely to be the source of @dehlong's issue as well? Ty both :):)
@duck1y it's almost certainly the same problem.
@pdevine If you have time, could you also take a look at this error?
@Anorid can you create a new issue for that problem and post the Modelfile along with any relevant info (such as whether you're trying to use a converted model, and where you got the weights from)? This is definitely a different issue than the one you posted.
I've created a new issue and posted the relevant information.
Sorry, I tried to do as instructed but still get an error:
get-content LLMteacher-modelfile | out-file LLMtest -Encoding ascii
Error: command must be one of "from", "license", "template", "system", "adapter", "parameter", or "message"
Windows 11. What am I doing wrong?
Here are more details:
Error: command must be one of "from", "license", "template", "system", "adapter", "parameter", or "message"
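If the error persists after a conversion attempt, one quick check is whether the file still carries a UTF-16 byte-order mark. A minimal sketch of such a BOM check (the function name and return strings are my own, not from Ollama or this thread):

```python
# Hedged sketch: inspect the first bytes of a Modelfile to guess its encoding.
def detect_bom(data: bytes) -> str:
    """Return a best-guess encoding name based on a leading byte-order mark."""
    if data.startswith(b"\xff\xfe"):
        return "utf-16-le"
    if data.startswith(b"\xfe\xff"):
        return "utf-16-be"
    if data.startswith(b"\xef\xbb\xbf"):
        return "utf-8-sig"
    return "utf-8 (no BOM) or other"

# Example: a UTF-16 LE file as Windows Notepad's "Unicode" option writes it.
sample = b"\xff\xfe" + "FROM llama3".encode("utf-16-le")
print(detect_bom(sample))  # utf-16-le
```

If this reports a UTF-16 variant, the conversion step needs to be re-run on that file before running ollama create on it.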
For anyone having problems creating a model on Windows: not sure why this isn't documented in a straightforward manner, but that's how I do it. Works great.
This should be fixed w/ |
Thank you both! The workaround helped me!
What is the issue?
Hello,
I'm trying to create a new model, and no matter what the Modelfile is, 90% of the time I get:
Error: command must be one of "from", "license", "template", "system", "adapter", "parameter", or "message"
Is there any solution to this?
This is my modelfile:
FROM llama3
PARAMETER temperature 1
PARAMETER num_ctx 4096
SYSTEM You are Mario from super mario bros, acting as an assistant.
OS
Linux
GPU
Other
CPU
Intel
Ollama version
0.1.38