A repository of code samples for Hazelcast Jet. The samples show you how to use the Pipeline API to solve a range of use cases, how to integrate Jet with other systems, and how to connect to various data sources (both Hazelcast IMDG and third-party systems). There is also a folder with samples using the Core API.
- apply a sliding window
- perform basic aggregation (counting)
- print the results on the console
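The windowed-count sample above can be sketched roughly as follows, assuming the Jet 3.x Pipeline API (`drawFrom`/`drainTo` naming) and a hypothetical `Trade` event class with `ticker()` and `timestamp()` accessors; the map name `trades` is also made up:

```java
import static com.hazelcast.jet.Util.mapEventNewValue;
import static com.hazelcast.jet.Util.mapPutEvents;
import static com.hazelcast.jet.aggregate.AggregateOperations.counting;
import static com.hazelcast.jet.pipeline.JournalInitialPosition.START_FROM_OLDEST;

import com.hazelcast.jet.pipeline.Pipeline;
import com.hazelcast.jet.pipeline.Sinks;
import com.hazelcast.jet.pipeline.Sources;
import com.hazelcast.jet.pipeline.WindowDefinition;

public class SlidingWindowSketch {
    public static Pipeline build() {
        Pipeline p = Pipeline.create();
        p.drawFrom(Sources.<Trade, Long, Trade>mapJournal("trades",
                mapPutEvents(), mapEventNewValue(), START_FROM_OLDEST))
         .withTimestamps(Trade::timestamp, 3_000)         // tolerate 3 s of event lag
         .window(WindowDefinition.sliding(60_000, 1_000)) // 1-minute window, sliding by 1 s
         .groupingKey(Trade::ticker)
         .aggregate(counting())                           // count trades per ticker per window
         .drainTo(Sinks.logger());                        // print the results on the console
        return p;
    }
}
```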
- like the above, plus:
- add a second-level aggregation stage to find top/bottom N results
- apply a session window
- use a custom Core API processor as the event source
- perform a composite aggregate operation (apply two aggregate functions in parallel)
- print the results on the console
- use the `SourceBuilder` to create a mock source of trade events from a stock market
- apply a tumbling window, configured to emit early results
- aggregate by summing a derived value
- present the results in a live GUI chart
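A mock trade source like the ones described above can be sketched with `SourceBuilder` (Jet 3.x API); the `Trade` class and its constructor are hypothetical:

```java
import java.util.Random;

import com.hazelcast.jet.pipeline.SourceBuilder;
import com.hazelcast.jet.pipeline.StreamSource;

public class TradeSourceSketch {
    public static StreamSource<Trade> tradeSource() {
        return SourceBuilder
                // the context object created per processor; here just a Random
                .timestampedStream("trade-source", ctx -> new Random())
                .<Trade>fillBufferFn((rnd, buf) -> {
                    // emit one random trade per call, stamped with the current time
                    long now = System.currentTimeMillis();
                    buf.add(new Trade("ACME", rnd.nextInt(100), now), now);
                })
                .build();
    }
}
```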
- use `SourceBuilder` to create a mock source of trade events from a stock market
- simple rolling aggregation (summing the price)
- keep updating the target map with the current values of aggregation
- present the results in a live GUI chart
- use an `IMap` as the data source
- stateless transforms to clean up the input (flatMap + filter)
- perform basic aggregation (counting)
- print a table of the most frequent words on the console
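As a rough sketch of the word-count shape described above (Jet 3.x API; the map names `lines` and `counts` are placeholders):

```java
import static com.hazelcast.jet.Traversers.traverseArray;
import static com.hazelcast.jet.aggregate.AggregateOperations.counting;
import static com.hazelcast.jet.function.Functions.wholeItem;

import java.util.Map.Entry;

import com.hazelcast.jet.pipeline.Pipeline;
import com.hazelcast.jet.pipeline.Sinks;
import com.hazelcast.jet.pipeline.Sources;

public class WordCountSketch {
    public static Pipeline build() {
        Pipeline p = Pipeline.create();
        p.drawFrom(Sources.<Long, String>map("lines"))   // entries: line number -> line text
         .flatMap((Entry<Long, String> e) ->
                 traverseArray(e.getValue().toLowerCase().split("\\W+"))) // split into words
         .filter(word -> !word.isEmpty())                // drop empty tokens
         .groupingKey(wholeItem())                       // group by the word itself
         .aggregate(counting())
         .drainTo(Sinks.map("counts"));                  // word -> count
        return p;
    }
}
```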
- serialize a small dataset to use as side input
- fork a pipeline stage into two downstream stages
- stateless transformations to clean up input
- count distinct items
- group by key, then group by secondary key
- aggregate to a map of (secondary key -> result)
- hash-join the forked stages
- open an interactive GUI to try out the results
- co-group three bounded data streams on a common key
- for each distinct key, emit the co-grouped items in a 3-tuple of lists
- store the results in an `IMap` and check they are as expected
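The three-way co-group above might be sketched like this with the Jet 3.x `aggregate3` API; `PageVisit`, `AddToCart`, and `Payment` are hypothetical event classes sharing a `userId()` key, and the list names are made up:

```java
import static com.hazelcast.jet.aggregate.AggregateOperations.toList;

import com.hazelcast.jet.pipeline.BatchStage;
import com.hazelcast.jet.pipeline.Pipeline;
import com.hazelcast.jet.pipeline.Sinks;
import com.hazelcast.jet.pipeline.Sources;

public class CoGroupSketch {
    public static Pipeline build() {
        Pipeline p = Pipeline.create();
        BatchStage<PageVisit> visits  = p.drawFrom(Sources.<PageVisit>list("visits"));
        BatchStage<AddToCart> carts   = p.drawFrom(Sources.<AddToCart>list("carts"));
        BatchStage<Payment> payments  = p.drawFrom(Sources.<Payment>list("payments"));

        visits.groupingKey(PageVisit::userId)
              .aggregate3(toList(),
                      carts.groupingKey(AddToCart::userId), toList(),
                      payments.groupingKey(Payment::userId), toList())
              // emits Entry<userId, Tuple3<List<PageVisit>, List<AddToCart>, List<Payment>>>
              .drainTo(Sinks.map("results"));
        return p;
    }
}
```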
- use the Event Journal of an `IMap` as a streaming source
- apply a sliding window
- co-group three unbounded data streams on a common key
- print the results on the console
- the sample is in the `enrichUsingIMap()` method
- use the Event Journal of an `IMap` as a streaming data source
- apply the `mapUsingIMap` transform to fetch the enriching data from another `IMap`
- enrich from two `IMap`s in two `mapUsingIMap` steps
- print the results on the console
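One `mapUsingIMap` enrichment step might look roughly like this (Jet 3.x API; `Trade`, its `productId()` accessor, and the `trades`/`products` map names are hypothetical):

```java
import static com.hazelcast.jet.Util.mapEventNewValue;
import static com.hazelcast.jet.Util.mapPutEvents;
import static com.hazelcast.jet.datamodel.Tuple2.tuple2;
import static com.hazelcast.jet.pipeline.JournalInitialPosition.START_FROM_OLDEST;

import com.hazelcast.jet.pipeline.Pipeline;
import com.hazelcast.jet.pipeline.Sinks;
import com.hazelcast.jet.pipeline.Sources;

public class EnrichUsingIMapSketch {
    public static Pipeline build() {
        Pipeline p = Pipeline.create();
        p.drawFrom(Sources.<Trade, Long, Trade>mapJournal("trades",
                mapPutEvents(), mapEventNewValue(), START_FROM_OLDEST))
         .withoutTimestamps()                     // no windowing needed for enrichment
         .mapUsingIMap("products",                // the enriching IMap: productId -> name
                 Trade::productId,                // key to look up for each trade
                 (trade, productName) -> tuple2(trade, productName))
         .drainTo(Sinks.logger());                // print the enriched pairs on the console
        return p;
    }
}
```

A second `mapUsingIMap` step against another map would be chained the same way.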
- the sample is in the `enrichUsingReplicatedMap()` method
- use the Event Journal of an `IMap` as a streaming data source
- apply the `mapUsingReplicatedMap` transform to fetch the enriching data from another `ReplicatedMap`
- enrich from two `ReplicatedMap`s in two `mapUsingReplicatedMap` steps
- print the results on the console
- the sample is in the `enrichUsingAsyncService()` method
- prepare a mock data server: a gRPC-based network service
- use the Event Journal of an `IMap` as a streaming data source
- enrich the unbounded data stream by making async gRPC calls to the mock service
- print the results on the console
- the sample is in the `enrichUsingHashJoin()` method
- use the Event Journal of an `IMap` as a streaming data source
- use a directory of files as a batch data source
- hash-join an unbounded stream with two batch streams in one step
- print the results on the console
- XML Configuration
- YAML Configuration
- Logging Configuration
- Configure Fault Tolerance
- Enterprise Feature: Configure SSL
- Suspend/Resume a Job
- Restart/Rescale a Job
- Inspect and Manage Existing Jobs
- Idempotently Submit a Job
- submit a job with the same name to two Jet members
- result: only one job running, both clients get a reference to it
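The idempotent-submit pattern can be sketched with `newJobIfAbsent` (available since Jet 3.0); the job name and pipeline here are placeholders:

```java
import com.hazelcast.jet.JetInstance;
import com.hazelcast.jet.Job;
import com.hazelcast.jet.config.JobConfig;
import com.hazelcast.jet.pipeline.Pipeline;

public class IdempotentSubmitSketch {
    public static Job submit(JetInstance jet, Pipeline pipeline) {
        JobConfig config = new JobConfig().setName("my-named-job");
        // If both clients call this with the same job name, only one job runs;
        // the second call returns a reference to the already-running job.
        return jet.newJobIfAbsent(pipeline, config);
    }
}
```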
- Live-Update a Running Job's Code
- IMap as Source and Sink
- IMap in a Remote IMDG as Source and Sink
- Projection and Filtering Pushed into the IMap Source
- ICache as Source and Sink
- IList as Source and Sink
- Event Journal of IMap as a Stream Source
- variant with IMap in a remote cluster
- Event Journal of ICache as a Stream Source
- variant with ICache in a remote cluster
- Data Enrichment Using IMap or ReplicatedMap
- Kafka Source
- variant with Avro Serialization
- variant with JSON Serialization
- Kafka Sink
- Hadoop Distributed File System (HDFS) Source and Sink
- variant with Avro Serialization
- JDBC Source
- JDBC Sink
- JMS Queue Source and Sink
- JMS Topic Source and Sink
- TCP/IP Socket Source
- TCP/IP Socket Sink
- File Batch Source
- use Jet to analyze an HTTP access log file
- variant with Avro serialization
- File Streaming Source
- analyze the data being appended to log files while the Jet job is running
- File Sink
- variant with Avro serialization
- Custom Source:
- start an Undertow HTTP server that collects basic JVM stats
- construct a custom Jet source based on Java 11 HTTP client
- apply a sliding window
- compute linear trend of the JVM metric provided by the HTTP server
- present the results in a live GUI chart
- Custom Sink
- construct a custom Hazelcast `ITopic` sink
- Custom Side Input for Data Enrichment
- look at the `enrichUsingReplicatedMap()` method
- Annotation-Based Spring Context
- use programmatic Jet configuration in a Spring Application Context class
- annotation-based dependency injection into a Jet Processor
- XML-Based Spring Context
- configure Jet as a Spring bean in application-context.xml
- XML-based dependency injection into a Jet Processor
- XML-Based Dependency Injection into a Jet `Processor`
- configure Jet as a Spring bean using Jet's XML Schema for Spring Configuration
- XML-based dependency injection into a Jet Processor
- Spring Boot App that Runs a Jet Job
Use a pre-trained TensorFlow model to enrich a stream of movie reviews. The model classifies natural-language text by the sentiment it expresses.
- Pivotal Cloud Foundry
- Spring Boot Application that starts Hazelcast Jet in the Pivotal Cloud Foundry environment
- Docker
- `Makefile` that uses `docker-compose` to deploy Jet into a Docker container
- Docker Cloud Stack File that describes Jet to Docker
- Kubernetes
Code samples that show how to use the low-level Core DAG API directly. See the dedicated README for more details.
Hazelcast is available under the Apache 2 License. Please see the Licensing section for more information.
Copyright (c) 2008-2019, Hazelcast, Inc. All Rights Reserved.
Visit www.hazelcast.com for more info.