🔍 AI search engine - self-host with local or cloud LLMs
Updated Jun 2, 2024 · TypeScript
Your GenAI second brain 🧠: a personal productivity assistant (RAG) ⚡️🤖. Chat with your docs (PDF, CSV, ...) & apps using Langchain and LLMs such as GPT-3.5/4 Turbo, Anthropic, VertexAI, Ollama, and Groq, and share it with your users! A local & private alternative to OpenAI GPTs & ChatGPT, powered by retrieval-augmented generation.
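The retrieval-augmented generation pattern behind projects like this can be sketched in a few lines: embed document chunks, retrieve the chunks most similar to the query, and prepend them to the prompt. The bag-of-words `embed` function below is a toy stand-in for a real embedding model, and the prompt wording is illustrative, not any project's actual template.

```python
import math
import re
from collections import Counter

def embed(text):
    # Toy bag-of-words "embedding"; a real system would call an embedding model.
    return Counter(re.findall(r"\w+", text.lower()))

def cosine(a, b):
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, chunks, k=2):
    # Rank document chunks by similarity to the query and keep the top k.
    q = embed(query)
    return sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)[:k]

def build_prompt(query, chunks):
    # Retrieved context is prepended so the model answers from the docs.
    context = "\n".join(retrieve(query, chunks))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

chunks = [
    "Groq serves LLMs on custom LPU hardware.",
    "Qdrant is a vector database.",
    "Paris is the capital of France.",
]
print(retrieve("which database stores vectors?", chunks, k=1)[0])
```

Real systems swap the toy embedding for a model-generated vector and the list scan for a vector database, but the retrieve-then-prompt flow is the same.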
DevoxxGenie is an IntelliJ IDEA plugin that uses local LLMs (Ollama, LMStudio, GPT4All, and Jan) as well as cloud-based LLMs to help review, test, and explain your project code.
Generative AI suite powered by state-of-the-art models and providing advanced AI/AGI functions. It features AI personas, AGI functions, multi-model chats, text-to-image, voice, response streaming, code highlighting and execution, PDF import, presets for developers, and much more. Deploy on-prem or in the cloud.
Groq is a Python script that enables conversational interfaces with AI models from the command line. It allows users to interact with multiple AI models, save conversation history, and engage in natural-sounding conversations. With a simple setup and easy-to-use interface, Groq makes it easy to explore the world of AI-driven conversations.
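A command-line chat script like this mostly comes down to maintaining a message list and trimming it to fit the model's context window. A minimal, provider-agnostic sketch of that history handling; the message shape follows the common OpenAI-style chat format, and the `MAX_TURNS` limit is an assumed parameter, not taken from this script:

```python
MAX_TURNS = 3  # assumed limit: keep only the last N user/assistant exchanges

def add_turn(history, user_msg, assistant_msg, max_turns=MAX_TURNS):
    # Append one exchange, then drop the oldest messages beyond the window.
    history = history + [
        {"role": "user", "content": user_msg},
        {"role": "assistant", "content": assistant_msg},
    ]
    return history[-2 * max_turns:]

history = []
for i in range(5):
    history = add_turn(history, f"question {i}", f"answer {i}")

print(len(history))           # 6 messages = 3 retained exchanges
print(history[0]["content"])  # oldest retained message: "question 2"
```

Saving the conversation is then just serializing this list to disk and reloading it on the next run.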
Movie finder app with semantic search and sentiment analysis using OpenAI, Groq, and Qdrant
FreeGenius AI is an advanced AI assistant that can talk and take multi-step actions. It supports numerous open-source LLMs via Llama.cpp, Ollama, or the Groq Cloud API, with optional integration with AutoGen agents, the OpenAI API, Google Gemini Pro, and unlimited plugins.
AutoGroq is a groundbreaking tool that revolutionizes the way users interact with Autogen™ and other AI assistants. By dynamically generating tailored teams of AI agents based on your project requirements, AutoGroq eliminates the need for manual configuration and allows you to tackle any question, problem, or project with ease and efficiency.
A serverless invite-only AI-powered chat bot on Telegram.
Super-fast zero-shot AI assistant in the terminal
A tailored chatbot that reduces hallucinations and improves factuality.
A drop-in replacement for the OpenAI Assistants API
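"Drop-in replacement" means the server exposes the same HTTP routes as the hosted API, so existing clients only need a different base URL. A sketch of that idea; the route names match the public OpenAI API, but the localhost address is an assumed example, not this project's documented default:

```python
BASE_URL = "http://localhost:8000"  # assumed self-hosted endpoint

def assistants_url(path):
    # Same /v1/... paths as api.openai.com, different host:
    # that is what makes the replacement "drop-in".
    return f"{BASE_URL}/v1/{path}"

print(assistants_url("assistants"))  # http://localhost:8000/v1/assistants
print(assistants_url("threads"))     # http://localhost:8000/v1/threads
```

In practice most OpenAI client libraries accept a configurable base URL, so pointing them at the self-hosted server is the only change required.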
Open-source Python SDK for agent monitoring, LLM cost tracking, benchmarking, and more. Integrates with most LLMs and agent frameworks, such as CrewAI, Langchain, and AutoGen.
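At its core, LLM cost tracking is bookkeeping: multiply prompt and completion token counts by per-model prices. A toy sketch of the idea; the model name and price table below are illustrative, not real pricing, and a real SDK would hook into each provider's responses automatically rather than being called by hand:

```python
# Illustrative per-million-token prices; not actual provider pricing.
PRICES = {"example-model": {"prompt": 0.50, "completion": 1.50}}

class CostTracker:
    def __init__(self):
        self.total = 0.0

    def record(self, model, prompt_tokens, completion_tokens):
        # Convert token counts into dollars and accumulate a running total.
        p = PRICES[model]
        cost = (prompt_tokens * p["prompt"] +
                completion_tokens * p["completion"]) / 1_000_000
        self.total += cost
        return cost

tracker = CostTracker()
tracker.record("example-model", 1000, 500)
print(round(tracker.total, 6))  # 0.00125
```

Benchmarking and monitoring layer on top of the same records: each call's tokens, latency, and cost are logged, then aggregated per agent or per session.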
This repository provides a framework to integrate internet search capabilities with a large language model (LLM), specifically using the Gemini 1.5 API. This allows the LLM to fetch and use real-time data from the internet to enhance its responses to user queries.
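The pattern this framework describes is: run a web search, then inject the results into the model's prompt so it can answer from current data. A minimal sketch under stated assumptions: `web_search` and `ask_llm` are stand-ins for a real search API and the actual Gemini 1.5 call, and the prompt wording is invented for illustration.

```python
def web_search(query):
    # Stand-in for a real search API; returns (title, snippet) pairs.
    return [("Example result", f"snippet about {query}")]

def build_augmented_prompt(query, results):
    # Real-time snippets are injected so the model can use current data.
    sources = "\n".join(f"- {title}: {snippet}" for title, snippet in results)
    return (f"Use these search results to answer.\n"
            f"Results:\n{sources}\n\nQuestion: {query}")

def ask_llm(prompt):
    # Stand-in for the actual Gemini 1.5 API call.
    return f"(model answer grounded in: {prompt.splitlines()[2]})"

query = "latest Groq LPU benchmarks"
prompt = build_augmented_prompt(query, web_search(query))
print(ask_llm(prompt))
```

The real framework replaces both stubs with live calls, but the prompt-augmentation step in the middle is the essence of the technique.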