The service reads AVRO payloads from the filesystem and publishes them to the internal and cloud Kafka clusters.
Task:

- Reading files from the filesystem when they arrive
- Parsing the data in the files
- Serializing that data in preparation for publishing to Kafka clusters
- Creating a Kafka client
- Publishing to multiple internal clusters
- Publishing to a Kafka cluster running on a cloud platform
- Removing the files from the filesystem when their contents have been published
- Providing an SDK for use by other teams in your startup for some of the key, high-level functionality
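The flow described above might look roughly like the sketch below, assuming an ingest directory and a `Publisher` abstraction over the internal and cloud producers; the names here are illustrative, not the service's actual API.

```go
package ingest

import (
	"log"
	"os"
	"path/filepath"
)

// Publisher abstracts a destination Kafka cluster (internal or cloud).
type Publisher interface {
	Publish(record []byte) error
}

// ingestOnce reads each file in dir, publishes its contents to every
// configured cluster, and removes the file once all publishes succeed.
func ingestOnce(dir string, publishers []Publisher) error {
	entries, err := os.ReadDir(dir)
	if err != nil {
		return err
	}
	for _, e := range entries {
		if e.IsDir() {
			continue
		}
		path := filepath.Join(dir, e.Name())
		record, err := os.ReadFile(path)
		if err != nil {
			log.Printf("read %s: %v", path, err)
			continue
		}
		published := true
		for _, p := range publishers {
			if err := p.Publish(record); err != nil {
				log.Printf("publish %s: %v", path, err)
				published = false
			}
		}
		if published {
			// Only remove the file once its contents reached every cluster.
			if err := os.Remove(path); err != nil {
				log.Printf("remove %s: %v", path, err)
			}
		}
	}
	return nil
}
```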
Presentation

Run the service with `make run`.

Generate a mock payload file with `make generate-mock`. Sample:
```json
{
  "Id": "fC30BffaBE04",
  "Reference": "azDUa",
  "Amount": {
    "Value": -231.51184,
    "Currency": "USD"
  },
  "BookedTime": "1609911988"
}
```
This will generate a file in the `/tmp/ingest/` folder.
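For illustration, the sample above could be decoded into a Go type such as the one below; the `Transaction` and `Amount` names and the use of `encoding/json` are assumptions based on the sample file, not the service's real model.

```go
package ingest

import "encoding/json"

// Amount is the nested money object from the sample payload.
type Amount struct {
	Value    float64 `json:"Value"`
	Currency string  `json:"Currency"`
}

// Transaction mirrors the top-level fields of the sample payload.
type Transaction struct {
	Id         string `json:"Id"`
	Reference  string `json:"Reference"`
	Amount     Amount `json:"Amount"`
	BookedTime string `json:"BookedTime"` // Unix timestamp carried as a string in the sample
}

// parseTransaction decodes the contents of one generated payload file.
func parseTransaction(raw []byte) (Transaction, error) {
	var t Transaction
	err := json.Unmarshal(raw, &t)
	return t, err
}
```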
- `make fmt lint test` formats, lints, and runs the tests.
- `make test-ci` runs the tests as they run in CI.
- `make help` lists the available make targets.
The service is configured with the following environment variables:

| Name | Description |
|---|---|
| HTTP_PORT | HTTP port to serve from |
| KAFKA_SERVERS | List of Kafka brokers (comma separated) |
| KAFKA_SCHEMA_SERVERS | List of Kafka Schema Registry servers (comma separated) |
| KAFKA_CLOUD_SERVERS | List of CCloud servers |
| KAFKA_CLOUD_SCHEMA_SERVERS | List of CCloud Schema Registry servers |
| KAFKA_CLOUD_KEY | CCloud authentication key |
| KAFKA_CLOUD_SECRET | CCloud authentication secret |
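As a sketch, these variables could be loaded into a config struct along the lines of the following; the `Config` type and `ConfigFromEnv` helper are hypothetical names, not the service's actual code.

```go
package ingest

import (
	"os"
	"strings"
)

// Config holds the settings read from the environment variables above.
type Config struct {
	HTTPPort                string
	KafkaServers            []string
	KafkaSchemaServers      []string
	KafkaCloudServers       []string
	KafkaCloudSchemaServers []string
	KafkaCloudKey           string
	KafkaCloudSecret        string
}

// splitList turns a comma-separated variable into a slice of addresses.
func splitList(v string) []string {
	if v == "" {
		return nil
	}
	return strings.Split(v, ",")
}

// ConfigFromEnv reads the variables listed in the table above.
func ConfigFromEnv() Config {
	return Config{
		HTTPPort:                os.Getenv("HTTP_PORT"),
		KafkaServers:            splitList(os.Getenv("KAFKA_SERVERS")),
		KafkaSchemaServers:      splitList(os.Getenv("KAFKA_SCHEMA_SERVERS")),
		KafkaCloudServers:       splitList(os.Getenv("KAFKA_CLOUD_SERVERS")),
		KafkaCloudSchemaServers: splitList(os.Getenv("KAFKA_CLOUD_SCHEMA_SERVERS")),
		KafkaCloudKey:           os.Getenv("KAFKA_CLOUD_KEY"),
		KafkaCloudSecret:        os.Getenv("KAFKA_CLOUD_SECRET"),
	}
}
```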
The files in the `/docs` folder are notes I made during my Kafka research.