
[Feature] Offline Llama model support request. #793

Closed
1 task done
ydlee1994 opened this issue Apr 28, 2024 · 1 comment
Assignees
Labels
enhancement New feature or request

Comments


ydlee1994 commented Apr 28, 2024

Is there an existing issue for this?

  • I have searched the existing issues

Environment

  • OS: Win 11
  • Zotero Version: 6.0.36
  • Plugin Version: latest

Describe the feature request

Is your feature request related to a problem? Please describe.
No.

Why do you need this feature?
Sometimes there is no internet connection, but I still need a translation service. An offline Llama model would provide this.

Describe the solution you'd like

A connection to Ollama, just like the solution TTime provides.

Anything else?

No response

@ydlee1994 ydlee1994 added the enhancement New feature or request label Apr 28, 2024
GrayXu (Contributor) commented May 22, 2024

#799 adds custom model support.
For Ollama, you can use http://localhost:11434/v1/chat/completions and specify your local model name.
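A minimal sketch of what such a call could look like, using the OpenAI-compatible Ollama endpoint mentioned above. The model name `llama3` and the `translate` helper are assumptions for illustration; substitute whatever model name `ollama list` reports on your machine, and note that the request only succeeds with an Ollama server running locally.

```python
import json
from urllib import request

# Assumed local Ollama endpoint (OpenAI-compatible chat completions route).
OLLAMA_URL = "http://localhost:11434/v1/chat/completions"
MODEL = "llama3"  # hypothetical model name; use what `ollama list` shows


def build_payload(text, target_lang="English"):
    """Build an OpenAI-compatible chat payload asking for a translation."""
    return {
        "model": MODEL,
        "messages": [
            {
                "role": "user",
                "content": f"Translate the following into {target_lang}:\n{text}",
            }
        ],
    }


def translate(text, target_lang="English"):
    """Send the translation request to the local Ollama server."""
    req = request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(text, target_lang)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        body = json.loads(resp.read())
        # OpenAI-compatible responses put the text under choices[0].message.content
        return body["choices"][0]["message"]["content"]
```

Because the endpoint follows the OpenAI chat-completions shape, the same payload should work with any client that speaks that API once the base URL is pointed at localhost:11434.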

windingwind closed this as not planned May 22, 2024