Hello, this repo is very helpful, but it is 4 years old. Is this still the recommended way to do ML inference with Kafka Streams?
Yes. Absolutely.
Embedding an analytic model is the appropriate way to do reliable model scoring with low latency.
Some model servers also provide native Kafka interfaces (see, e.g., https://www.kai-waehner.de/blog/2020/10/27/streaming-machine-learning-kafka-native-model-server-deployment-rpc-embedded-streams/). That is another good option for some use cases, though typically not as robust or as low-latency as embedding the model.
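To make the embedded pattern concrete, here is a minimal sketch. The linear "model" and all names are hypothetical, and plain Java stands in for the Kafka Streams topology (the real app would load the model once at startup and call the same scoring function inside `mapValues()` on a `KStream`); it only illustrates why in-process scoring avoids the RPC hop a model server needs.

```java
import java.util.List;
import java.util.stream.Collectors;

public class EmbeddedScoringSketch {

    // Stand-in for a model loaded once into application memory at startup.
    // (Hypothetical weights; a real app might deserialize a PMML/ONNX/TF model.)
    static final double[] WEIGHTS = {0.5, -0.25};
    static final double BIAS = 0.1;

    // Scoring is a plain in-process function call: no network round trip,
    // no model server, which is why embedded inference has low, predictable latency.
    static double score(double[] features) {
        double z = BIAS;
        for (int i = 0; i < features.length; i++) {
            z += WEIGHTS[i] * features[i];
        }
        return z;
    }

    public static void main(String[] args) {
        // Simulated record stream. In a Kafka Streams topology this would be:
        //   builder.stream("input-topic", Consumed.with(...))
        //          .mapValues(EmbeddedScoringSketch::score)
        //          .to("scored-topic");
        List<double[]> records = List.of(
            new double[]{1.0, 2.0},
            new double[]{0.0, 4.0});
        List<Double> scores = records.stream()
            .map(EmbeddedScoringSketch::score)
            .collect(Collectors.toList());
        System.out.println(scores);
    }
}
```

The key design point is that `score()` is stateless with respect to the stream, so each Streams task can score records independently and the whole pipeline scales by adding partitions.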