
[Feature Request] Ollama integration #156

Open
mtompkins opened this issue Feb 4, 2024 · 2 comments
Labels: enhancement (New feature or request), PR welcome (Pull request welcome)

Comments

mtompkins commented Feb 4, 2024

💭 Describe the feature

In addition to OpenAI, with its associated cost, it would be great if we were able to use Ollama.

💡 Proposed Solution

Extend the AI integration to support Ollama, which allows running various LLM models locally for free.

@mtompkins mtompkins added the enhancement New feature or request label Feb 4, 2024
@Zhengqbbb Zhengqbbb added the PR welcome Pull request welcome label Feb 5, 2024
gandli commented Mar 8, 2024

Recently, ollama has added compatibility with the OpenAI API. One feasible way for czg to support Ollama is to use the 'ollama cp' command to copy an existing model under a temporary name. Concretely, run a command like the following in the terminal:

ollama cp gemma gpt-3.5-turbo

Next, modify the .czrc file so it looks like the following:

{
  "openAIToken": " ",
  "apiEndpoint": "http://localhost:11434/v1"
}
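
For reference, a minimal end-to-end sketch of this workaround (gemma is just an example model, and the commands assume a standard local Ollama install listening on the default port 11434):

# pull a local model and alias it under the model name czg requests by default
ollama pull gemma
ollama cp gemma gpt-3.5-turbo
# start the OpenAI-compatible server on http://localhost:11434
ollama serve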

Zhengqbbb (Owner) commented Mar 11, 2024

> Recently, ollama has added compatibility with the OpenAI API. One feasible way for czg to support Ollama is to use the 'ollama cp' command to copy an existing model under a temporary name. Concretely, run a command like the following in the terminal:

Thanks! I haven't started development on this yet because my machine doesn't have enough storage 🫠


> Next, modify the .czrc file so it looks like the following:

Here I would recommend configuring it via the command line instead, because the AI config is loaded from a different path than the other configuration:

npx czg --api-key=" " --api-endpoint="http://localhost:11434/v1"
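
Once the key and endpoint are stored this way, generating a commit message should just be a matter of invoking the ai subcommand against the local Ollama endpoint (a sketch; it assumes the model name czg requests, e.g. the gpt-3.5-turbo alias from the ollama cp workaround above, is available on the Ollama side):

# sketch: generate an AI commit message using the locally served model
npx czg ai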
