Standardized Serverless ML Inference Platform on Kubernetes
Updated May 23, 2024 - Python
Hopsworks - Data-Intensive AI platform with a Feature Store
My repo for the Machine Learning Engineering bootcamp 2022 by DataTalks.Club
AWS SageMaker, SeldonCore, KServe, Kubeflow & MLflow, VectorDB
🪐 1-click Kubeflow using ArgoCD
Carbon Limiting Auto Tuning for Kubernetes
Deploying machine learning model using 10+ different deployment tools
Client/server system to perform distributed inference on high-load systems.
Hands-on labs on deploying machine learning models with tf-serving and KServe
A demo to accompany our blogpost "Scalable Machine Learning with Kafka Streams and KServe"
Everything needed to get industrial Kubeflow applications running in production
Kubeflow examples - Notebooks, Pipelines, Models, Model tuning and more
Collection of best practices, reference architectures, examples, and utilities to deploy Foundation Models with KServe on AWS.
KServe Inference Graph Example
KServe TrustyAI explainer
An end-to-end machine learning prediction pipeline for the Rossmann store sales problem
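Many of the repositories above deploy models through a KServe `InferenceService` resource. As a minimal sketch of what that looks like, the following manifest (modeled on the sklearn quickstart from the KServe documentation; the `storageUri` is an example model location, not one from the repositories listed here) serves a scikit-learn model serverlessly on Kubernetes:

```yaml
# Minimal KServe InferenceService: serves a pre-trained sklearn model.
# Assumes KServe is installed in the cluster; storageUri is illustrative.
apiVersion: serving.kserve.io/v1beta1
kind: InferenceService
metadata:
  name: sklearn-iris
spec:
  predictor:
    model:
      modelFormat:
        name: sklearn
      storageUri: gs://kfserving-examples/models/sklearn/1.0/model
```

Applied with `kubectl apply -f`, this creates a scale-to-zero HTTP endpoint that accepts prediction requests against the model at `storageUri`.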