
Support for Edge AI / ARM devices #938

Answered by parano
highvight asked this question in General

Hi @highvight, great question and summary of the challenges on edge serving!

Yes indeed, currently BentoML works well for most model serving deployments in the cloud or data center (including online API serving, offline batch serving, distributed batch serving, and streaming serving), but it is not well suited for edge serving. Here is a bit more context on why this is the current state:

The main benefits of BentoML are 1) providing an abstraction for data scientists to describe how clients interact with their model, while automatically packaging all required code and dependencies into the BentoML bundle format, and 2) providing a high-performance, flexible runtime to serve that bundle format.
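The two benefits above can be sketched in plain Python. This is a hypothetical illustration of the bundle idea only, not BentoML's actual API; `ModelBundle`, `double`, and the pinned requirement are invented for the example:

```python
from dataclasses import dataclass, field
from typing import Callable, List

# Hypothetical sketch (not the real BentoML API): a "bundle" couples the
# inference callable (how clients interact with the model) with the
# dependency metadata needed to reproduce it elsewhere.
@dataclass
class ModelBundle:
    name: str
    predict: Callable[[list], list]  # the client-facing inference interface
    requirements: List[str] = field(default_factory=list)  # pinned dependencies

def double(xs: list) -> list:
    """Stand-in model: doubles each input value."""
    return [2 * x for x in xs]

bundle = ModelBundle(
    name="demo-service",
    predict=double,
    requirements=["scikit-learn==0.23.2"],  # illustrative pin only
)

print(bundle.predict([1, 2, 3]))  # [2, 4, 6]
```

A real serving framework additionally snapshots the model artifact and user code alongside this metadata, so the same bundle can be deployed to an API server, a batch job, or (in principle) an edge runtime.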

T…

Replies: 1 comment 2 replies

2 replies
@highvight
@lonelygo
Answer selected by highvight