
[Feature Request]: Code completion using local LLM #3360

Open
maxguru opened this issue Apr 30, 2024 · 0 comments
maxguru commented Apr 30, 2024

Describe your idea in details

It has now become possible to run local LLMs at reasonable speed, and there is a large selection of models to choose from. Many apps already make it easy to run models locally.

Local LLMs need not even be a strict requirement: the plugin could also support external LLM APIs, should the user decide to go that route.

I suggest a plugin for CodeLite similar to GitHub Copilot's Visual Studio Code extension.

Possible Features:

  • Model / API selection and configuration
  • Code completion in the code editor
  • AI chat in an IDE widget
  • Retrieval-augmented generation (RAG) using project files
  • Autonomous task completion using function calling

This feature is critical for CodeLite. It would significantly enhance the coding experience, reducing development time and increasing productivity. It would allow CodeLite to compete with other IDEs that either have this feature already or will have it in the near future.
