How does BentoML compare to ray-project/ray? #1135

Answered by parano
tangyong asked this question in Q&A

Hi @tangyong, Ray Serve is probably the closest point of comparison to BentoML; it is just one small component of Ray. Here are the three main differences between Ray Serve and BentoML:

  • Ray Serve only works within a Ray cluster, while BentoML supports deployment to many different platforms, including Kubernetes, OpenShift, AWS SageMaker, AWS Lambda, Azure ML, GCP, and Heroku, as well as batch inference jobs on Apache Spark, Apache Airflow, etc.

  • Ray Serve only supports serving web HTTP traffic, while BentoML supports online API serving via REST/HTTP and gRPC (on the roadmap), plus offline batch serving, programmatic access to your model (Python API, PyPI), and deploying the model as a distributed batch or streaming job on Spark.

  • BentoML…

Answer selected by tangyong