LMDeploy is a toolkit for compressing, deploying, and serving LLMs.
🤯 Lobe Chat - an open-source, modern-design ChatGPT/LLMs UI/Chat Framework. Supports speech-synthesis, multi-modal, and extensible plugin system. One-click FREE deployment of your private ChatGPT/Gemini/Ollama chat application.
JetBrains extension providing access to state-of-the-art LLMs, such as GPT-4, Claude 3, Code Llama, and others, all for free
A versatile CLI and Python wrapper for Perplexity's suite of large language models, including their flagship Chat and Online 'Sonar Llama-3' models along with 'Llama-3' and 'Mixtral'. Streamline the creation of chatbots and search the web with AI (in real-time) with ease.
The most no-nonsense, locally or API-hosted AI code completion plugin for Visual Studio Code - like GitHub Copilot but completely free and 100% private.
Create chatbots with ease
An open-source alternative to GitHub Copilot that runs locally.
Web app implementing an AI code review using large language models
Octogen is an Open-Source Code Interpreter Agent Framework
Local CLI Copilot, powered by CodeLLaMa. 💻🦙
Documentation for the twinny Visual Studio Code extension
Use Ollama LLMs for code completion
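Several of the plugins above drive completions through a locally running Ollama server. A minimal sketch of that pattern, assuming Ollama is running on its default port (11434) and a `codellama` model has already been pulled; the endpoint and request fields follow Ollama's documented `/api/generate` API:

```python
import json
import urllib.request

# Default endpoint of a locally running Ollama server.
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_completion_request(prefix: str, model: str = "codellama") -> bytes:
    """Build the JSON payload for Ollama's /api/generate endpoint."""
    payload = {
        "model": model,        # assumes this model has been pulled locally
        "prompt": prefix,      # the code preceding the cursor
        "stream": False,       # return a single JSON object, not a stream
        "options": {"temperature": 0.2},  # low temperature for code
    }
    return json.dumps(payload).encode("utf-8")


def complete(prefix: str, model: str = "codellama") -> str:
    """Send a code prefix to the local Ollama server and return its completion."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_completion_request(prefix, model),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # The non-streaming response carries the generated text in "response".
        return json.loads(resp.read())["response"]
```

Calling `complete("def fibonacci(n):")` would then return the model's suggested continuation, provided the server is up.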
Deploy a RESTful API Server to interact with Ollama and Stable Diffusion