#799 provides custom model support. For Ollama, you can use http://localhost:11434/v1/chat/completions and specify your local model name.
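As a minimal sketch of the setup described above: Ollama exposes an OpenAI-compatible chat completions endpoint at the URL given, so a translation request is just a JSON POST. The model name "llama3" and the prompt below are placeholders, not anything the maintainer specified; substitute whatever model you have pulled locally.

```python
import json
import urllib.request

# Ollama's OpenAI-compatible endpoint (from the comment above).
URL = "http://localhost:11434/v1/chat/completions"

# "llama3" is an assumed example model name; use your own local model.
payload = {
    "model": "llama3",
    "messages": [
        {"role": "system",
         "content": "You are a translator. Translate the user's text to English."},
        {"role": "user", "content": "Hola, mundo"},
    ],
}

req = urllib.request.Request(
    URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# Uncomment to send against a running Ollama instance:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Since the endpoint mirrors the OpenAI API shape, any client that lets you override the base URL and model name (as #799 adds) should work the same way.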
— windingwind
Is there an existing issue for this?
Environment
Describe the feature request
Is your feature request related to a problem? Please describe.
No.
Why do you need this feature?
Sometimes there is no internet connection and I still need a translation service; an offline Llama model provides this.
Describe the solution you'd like
Just like the solution TTime provides: a connection with Ollama.
Anything else?
No response