Note: This is a Java version of the previously published Scala guide described below. It is a work in progress and also provides a Maven build for building and deploying the sample pipeline.
Building streaming pipelines that move data from one source to another, so it can be stored, used for analytics, or combined with other data, can get quite complicated. We've made developing streaming applications a lot simpler with Akka Data Pipelines, so you can be more productive when your use cases include the following:
- Internet of Things (IoT) applications to collect metrics from sensors or devices
- Digital twins of devices to monitor power equipment or connected cars
- Data pipelines that stream to a machine learning algorithm
To help you get started, we have a new Akka Data Pipelines IoT Sensor Tutorial that gives you a hands-on coding experience.
In case you missed it, Akka Data Pipelines is the enterprise packaging of Lightbend's OSS Cloudflow project, which provides Lightbend Telemetry for insights into your running system as well as world-class support and guidance from the Lightbend engineering teams.
By leveraging Akka Data Pipelines from Lightbend, you can accelerate development and decrease risks. To illustrate how we can help you build streaming systems faster and easier, this Akka Data Pipelines IoT Sensor Tutorial covers the following topics and domains:
- What is Akka Data Pipelines, and why you should care
- Setting up a local Kubernetes test environment using MicroK8s
- Installing Akka Data Pipelines
- Testing and running a sample application that tracks IoT sensor data for working windmills
- Monitoring the IoT PoC with Lightbend Telemetry and Console
- Finally, a walkthrough of the code base of the sample application
You can find the Akka Data Pipelines Tutorial here.
This repository contains the Java version; you can find the Scala one here.
- Upgrade Cloudflow to 2.2.2 and properly use the new inlet API
- Add Java Akka gRPC Client - streaming request/response ("provideStreamed" API)
- Add Java Akka gRPC Client - request/response ("provide" API); a client usage sketch follows this list
- Upgrade Kafka to 2.8.0 with Strimzi
- Release the Java version to the public.
- Maven can be used here instead of sbt. For more information on using Maven with Cloudflow, please see this.
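The gRPC client entries above refer to calls against the sample's SensorDataService. Below is a minimal sketch of how such calls might look with an Akka gRPC generated Java client; the package name, generated class names, and the port are assumptions based on the service description shown later in this README, not the exact code in this repository.

```java
import akka.NotUsed;
import akka.actor.ActorSystem;
import akka.grpc.GrpcClientSettings;
import akka.stream.javadsl.Source;

// Generated by Akka gRPC from the sensor-data protobuf definition.
// Package and class names here are assumptions and may differ in this repository.
import sensordata.grpc.SensorData;
import sensordata.grpc.SensorDataReply;
import sensordata.grpc.SensorDataServiceClient;

public class SensorDataClientSketch {

  public static void main(String[] args) {
    ActorSystem system = ActorSystem.create("sensor-data-client");

    // Plain-text HTTP/2 connection to the ingress started by cloudflow:run-local
    // (the port is an assumption; check the run-local output for the actual one).
    GrpcClientSettings settings =
        GrpcClientSettings.connectToServiceAt("localhost", 3000, system).withTls(false);
    SensorDataServiceClient client = SensorDataServiceClient.create(settings, system);

    // Measurements are omitted here for brevity; the grpcurl examples below show the full payload.
    SensorData data =
        SensorData.newBuilder()
            .setDeviceId("c75cb448-df0e-4692-8e06-0321b7703992")
            .setTimestamp(1495545646279L)
            .build();

    // "provide" API: single request, single reply.
    client
        .provide(data)
        .thenAccept(reply -> System.out.println("provide reply, success=" + reply.getSuccess()));

    // "provideStreamed" API: a stream of requests yields a stream of replies.
    Source<SensorDataReply, NotUsed> replies =
        client.provideStreamed(Source.repeat(data).take(10));
    replies.runForeach(reply -> System.out.println("streamed reply: " + reply), system);
  }
}
```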
You can exercise the gRPC endpoints with grpcurl (https://github.com/fullstorydev/grpcurl). List the services exposed by the running application:
grpcurl -plaintext localhost:3000 list
SensorDataService
grpc.reflection.v1alpha.ServerReflection
grpcurl -plaintext localhost:3000 describe SensorDataService
SensorDataService is a service:
service SensorDataService {
  rpc Provide ( .SensorData ) returns ( .SensorDataReply );
  rpc ProvideStreamed ( stream .SensorData ) returns ( stream .SensorDataReply );
}
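For reference, Akka gRPC generates a Java service interface from a definition like the one above, and the ingress streamlet implements it. The sketch below shows roughly what that interface looks like, assuming the message names shown by grpcurl; the actual generated package and code in this repository may differ.

```java
import java.util.concurrent.CompletionStage;

import akka.NotUsed;
import akka.stream.javadsl.Source;

// Sketch of the interface Akka gRPC generates for SensorDataService
// (package omitted; it is determined by the .proto options in this repository).
public interface SensorDataService {

  // Backs the "provide" API: one SensorData request, one SensorDataReply response.
  CompletionStage<SensorDataReply> provide(SensorData in);

  // Backs the "provideStreamed" API: a stream of SensorData requests,
  // a stream of SensorDataReply responses.
  Source<SensorDataReply, NotUsed> provideStreamed(Source<SensorData, NotUsed> in);
}
```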
grpcurl -plaintext -d '{"deviceId":"c75cb448-df0e-4692-8e06-0321b7703992","timestamp":1495545646279,"measurements":{"power":1.7,"rotorSpeed":3.9,"windSpeed":105.9}}'
localhost:3000 SensorDataService/Provide
{
  "deviceId": "c75cb448-df0e-4692-8e06-0321b7703992",
  "success": true
}
grpcurl -plaintext -d '{"deviceId":"c75cb448-df0e-4692-8e06-0321b7703992","timestamp":1495545646279,"measurements":{"power":1.7,"rotorSpeed":3.9,"windSpeed":105.9}}'
localhost:8080 SensorDataService/Provide
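The next two requests send negative measurement values (power in the first, rotorSpeed in the second); the sample's validation logic presumably flags these readings as invalid: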
grpcurl -plaintext -d '{"deviceId":"dev1","timestamp":1495545646279,"measurements":
{"power":-1.7,"rotorSpeed":3.9,"windSpeed":105.9}}'
localhost:3000 SensorDataService/Provide
grpcurl -plaintext -d '{"deviceId":"c75cb448-df0e-4692-8e06-0321b7703992","timestamp":1495545646279,"measurements":{"power":1.7,"rotorSpeed":-3.9,"windSpeed":105.9}}'
localhost:8080 SensorDataService/Provide
Equivalent of sbt runLocal:
mvn clean package cloudflow:extract-streamlets cloudflow:verify-blueprint cloudflow:app-layout cloudflow:run-local
Equivalent of sbt buildApp:
mvn \
clean \
package \
cloudflow:extract-streamlets \
docker:build \
cloudflow:push-images \
-Ddocker.username=${DOCKER_USERNAME} \
-Ddocker.password=${DOCKER_PASSWORD} \
-DskipTests
mvn cloudflow:build-app