
Add support of custom endpoints or local LLM endpoints #198

Open
jagad89 opened this issue Jan 22, 2024 · 4 comments
Assignees
Labels
feature New feature or request P2 Long term work

Comments

@jagad89

jagad89 commented Jan 22, 2024

No description provided.

@basicthinker
Contributor

@jagad89 we don't support this out of the box.

The plan is to support local endpoints whose APIs are compatible with OpenAI's. You could then create a provider with its base URL set to your endpoint, similar to the example below, and change the providers of your models in the Settings accordingly.

[image: screenshot of a provider configured with a custom base URL]

Would this meet your requirement? Could you tell us more about which local LLMs you are using and what scenarios you plan to use DevChat for?

In any event, we will let you know when the above feature is ready.
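For reference, a provider entry of that shape in ~/.chat/config.yml might look roughly like the sketch below. This is a hedged guess: the key names (`providers`, `api_base`, `api_key`) and the example endpoint are assumptions, not confirmed DevChat syntax.

```yaml
# Hypothetical ~/.chat/config.yml fragment -- key names are assumptions,
# not confirmed DevChat syntax.
providers:
  my-local:
    api_base: http://192.168.1.10:8000/v1   # your OpenAI-compatible endpoint
    api_key: DUMMY                          # many local servers ignore the key
models:
  my-local-model:
    provider: my-local
```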

TODOs:

  • @yangbobo2021 make sure the Settings are synced with ~/.chat/config.yml and that new providers in the config file are shown in Settings.
  • @basicthinker write docs for this feature.

@basicthinker basicthinker added feature New feature or request P1 Important but not that urgent labels Jan 22, 2024
@jagad89
Author

jagad89 commented Jan 23, 2024

I will try the settings with a few custom and local endpoints and will keep you posted in this thread.

Basically, we would be able to use a local LLM over an intranet, without an internet connection.

@runjinz runjinz added P0 Urgent and, hopefully, important and removed P1 Important but not that urgent labels Mar 27, 2024
@runjinz runjinz assigned basicthinker and unassigned yangbobo2021 Mar 28, 2024
@runjinz runjinz assigned runjinz and basicthinker and unassigned basicthinker and runjinz Apr 7, 2024
@runjinz runjinz added P2 Long term work and removed P0 Urgent and, hopefully, important labels Apr 7, 2024
@runjinz

runjinz commented Apr 7, 2024

[image: screenshot of the updated configuration entry]

The configuration entry point has changed. Additionally, if you use a local model, the model configuration file needs to be edited manually. Modifying it through the interface is not supported at the moment, but that is planned, along with some customer requests, as a next step.

@trevorstr

I would like to use this extension with a local LLM so that I don't have to sign up for a paid service. Ollama is a great way to set up a local endpoint on a server or developer workstation.

https://ollama.com/
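Ollama exposes an OpenAI-compatible API (served under /v1 on port 11434 by default), so a provider pointed at it should plug into the approach described above. A hedged sketch of what that config could look like — again, the key names are assumptions, not confirmed DevChat syntax:

```yaml
# Hypothetical ~/.chat/config.yml fragment for Ollama -- key names are assumptions.
providers:
  ollama:
    api_base: http://localhost:11434/v1   # Ollama's OpenAI-compatible endpoint
    api_key: ollama                       # placeholder; Ollama ignores the key
models:
  llama3:                                 # any model pulled via `ollama pull`
    provider: ollama
```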

Projects
None yet
Development

No branches or pull requests

5 participants