
Recommended approach to use ML inference #23

Open
Krith-man opened this issue Nov 20, 2022 · 1 comment
Comments

@Krith-man

Hello, this repo is very helpful, but it is 4 years old.
Is this still the recommended way to use ML inference with Kafka Streams?

@kaiwaehner
Owner

Yes. Absolutely.

Embedding an analytic model is the appropriate way to do reliable model scoring with low latency.
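A minimal sketch of the embedded-model pattern in plain Java, assuming a hypothetical `Model` interface (in the actual examples the model would be deserialized from, e.g., an H2O or TensorFlow artifact, and scoring would happen inside a Kafka Streams `mapValues` step): the model is loaded once at startup and each record is scored in-process, with no network hop to a model server.

```java
import java.util.List;
import java.util.stream.Collectors;

public class EmbeddedScoringSketch {

    // Hypothetical model interface; a real application would embed a
    // trained model (H2O, TensorFlow, DL4J, ...) behind a shape like this.
    interface Model {
        double score(double[] features);
    }

    // Load the model ONCE at application startup -- never per record.
    // A lambda stands in here for a model deserialized from disk.
    static final Model MODEL = features -> features[0] * 0.5 + features[1] * 0.25;

    // In a Kafka Streams topology this would be stream.mapValues(MODEL::score);
    // in this sketch a plain list stands in for the record stream.
    public static List<Double> scoreAll(List<double[]> records) {
        return records.stream()
                .map(MODEL::score)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<double[]> records = List.of(
                new double[]{2.0, 4.0},
                new double[]{10.0, 0.0});
        // Each record is scored locally in-process: low latency, no RPC.
        System.out.println(scoreAll(records));
    }
}
```

Because scoring is just a local method call, latency and availability are bounded by the application itself rather than by a remote model server.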

Some model servers also offer native Kafka interfaces (see, e.g., https://www.kai-waehner.de/blog/2020/10/27/streaming-machine-learning-kafka-native-model-server-deployment-rpc-embedded-streams/). This is another good option for some use cases, though not as robust or as fast as the embedded approach.
