Private chat with local GPT with document, images, video, etc. 100% private, Apache 2.0. Supports oLLaMa, Mixtral, llama.cpp, and more. Demo: https://gpt.h2o.ai/ https://codellama.h2o.ai/
Firefly: a training toolkit for large language models, supporting Phi-3, Llama3, Gemma, MiniCPM, Yi, Deepseek, Orion, Xverse, Mixtral-8x7B, Zephyr, Mistral, Baichuan2, Llama2, Llama, Qwen, Baichuan, ChatGLM2, InternLM, Ziya2, Vicuna, Bloom, and other large models
An efficient, flexible and full-featured toolkit for fine-tuning large models (InternLM2, Llama3, Phi3, Qwen, Mistral, ...)
Chinese Mixtral mixture-of-experts large language models (Chinese Mixtral MoE LLMs)
Chat with ChatGPT, Claude, Llama 3, Phi 3, Mistral, Gemma and more from a powerful terminal user interface.
Like grep but for natural language questions. Based on Mistral 7B or Mixtral 8x7B.
🤘 TT-NN operator library, and TT-Metalium low level kernel programming model.
🐳 Aurora is a Chinese-language MoE model. Aurora is a further work built on Mixtral-8x7B that activates the model's chat capability in the Chinese open domain.
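Several of the projects above fine-tune or serve Mixtral-8x7B, a sparse mixture-of-experts model. The routing idea at its core — a gate picks the top-2 experts per token and blends their outputs with softmax-renormalized weights — can be illustrated with a toy NumPy sketch. This is an assumption-laden simplification for intuition, not Mixtral's actual implementation; the gate matrix and expert functions here are made up.

```python
import numpy as np

def top2_moe(x, gate_w, experts):
    """Toy sparse-MoE forward pass: route each token to its top-2
    experts and mix their outputs with softmax weights over the pair."""
    logits = x @ gate_w                          # (tokens, n_experts)
    top2 = np.argsort(logits, axis=-1)[:, -2:]   # indices of the 2 best experts
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        sel = logits[t, top2[t]]
        w = np.exp(sel - sel.max())
        w /= w.sum()                             # softmax over the 2 selected gates
        for weight, e in zip(w, top2[t]):
            out[t] += weight * experts[e](x[t])  # only 2 of n experts run per token
    return out

# Hypothetical usage: 3 tokens of dim 4, routed across 4 identity experts.
rng = np.random.default_rng(0)
x = rng.standard_normal((3, 4))
gate_w = rng.standard_normal((4, 4))
experts = [lambda v: v] * 4                      # identity experts for illustration
y = top2_moe(x, gate_w, experts)
```

Because the two gate weights sum to 1, identity experts return the input unchanged; real experts are feed-forward networks, which is where the capacity of an 8x7B model comes from while only ~2 experts' worth of compute runs per token.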
Inferflow is an efficient and highly configurable inference engine for large language models (LLMs).
🏗️ Fine-tune, build, and deploy open-source LLMs easily!
Build LLM-powered robots in your garage with MachinaScript For Robots!
🦙 Free and open-source Large Language Model (LLM) chatbot web UI and API. Self-hosted, offline-capable, and easy to set up. Powered by LangChain.
AI stack for interacting with LLMs, Stable Diffusion, Whisper, xTTS and many other AI models
Examples of RAG using Llamaindex with local LLMs - Gemma, Mixtral 8x7B, Llama 2, Mistral 7B, Orca 2, Phi-2, Neural 7B
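The RAG examples above all share the same retrieval step: embed the query, score it against document chunks, and feed the top matches to a local LLM. A minimal sketch of that retrieval stage, using a toy bag-of-words similarity in place of a real embedding model (the corpus and function names here are illustrative, not from any of the listed repos):

```python
import math
from collections import Counter

def embed(text):
    # Toy bag-of-words "embedding"; a real RAG pipeline would call
    # an embedding model (e.g. via LlamaIndex or sentence-transformers).
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs, k=2):
    """Return the k documents most similar to the query."""
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

# Hypothetical usage: the retrieved chunks would be prepended to the
# prompt sent to a local model such as Mistral 7B or Mixtral 8x7B.
docs = [
    "mixtral is a sparse mixture of experts model",
    "llamas are domesticated animals from the andes",
]
top = retrieve("mixture of experts", docs, k=1)
```

Swapping `embed` for a dense embedding model and adding chunking is essentially what the Llamaindex examples above wire up.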
Design, conduct and analyze results of AI-powered surveys and experiments. Simulate social science and market research with large numbers of AI agents and LLMs.
Bypass restricted and censored content on AI chat prompts 😈