Integrating AI into every workflow with our open-source, no-code platform, powered by the actor model for dynamic, graph-based solutions.
Updated Jun 11, 2024 - TypeScript
A programming framework for agentic AI. Discord: https://aka.ms/autogen-dc. Roadmap: https://aka.ms/autogen-roadmap
The open source Tines alternative. Automate security workflows at scale with code and no-code.
A high-throughput and memory-efficient inference and serving engine for LLMs
RAGFlow is an open-source RAG (Retrieval-Augmented Generation) engine based on deep document understanding.
The easiest way to serve AI/ML models in production - Build Model Inference Service, LLM APIs, Multi-model Inference Graph/Pipelines, LLM/RAG apps, and more!
🐢 Open-Source Evaluation & Testing for LLMs and ML models
Hamilton helps data scientists and engineers define testable, modular, self-documenting dataflows that encode lineage and metadata. Runs and scales everywhere Python does.
A RAG (Retrieval-Augmented Generation) framework by TrueFoundry for building modular, open-source applications for production.
AI Observability & Evaluation
Multi-LoRA inference server that scales to 1000s of fine-tuned LLMs
Test your prompts, agents, and RAGs. Use LLM evals to improve your app's quality and catch problems. Compare performance of GPT, Claude, Gemini, Llama, and more. Simple declarative configs with command line and CI/CD integration.
Python SDK for experimenting, testing, evaluating & monitoring LLM-powered applications - Parea AI (YC S23)
An AI chatbot application built on Gemini, combining an LLM with a large image model. You can run the project locally and ask questions about your images in real time, as if you were chatting with them.
JetStream is a throughput and memory optimized engine for LLM inference on XLA devices, starting with TPUs (and GPUs in future -- PRs welcome).