GreenOpsStem

GreenOps system backend

Docker Compose

Up:

docker-compose up --build

Down:

docker-compose down

Services (as individual docker containers)

InboundTelemetryService

A REST API that accepts raw telemetry messages and relays them to a Kafka topic.

  • Input: REST endpoint http://localhost:80/process
  • Output: Kafka topic inbound-telemetry
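The relay step can be sketched as a pure function, independent of the HTTP framework and Kafka client the service actually uses (both of those are assumptions here and are omitted):

```python
import json

# Output topic named in this README
INBOUND_TOPIC = "inbound-telemetry"

def to_kafka_record(payload: dict) -> tuple[str, bytes]:
    """Wrap an incoming REST body into a (topic, value) pair.
    A real KafkaProducer.send(topic, value) call would consume this;
    the producer wiring itself is not shown."""
    return INBOUND_TOPIC, json.dumps(payload).encode("utf-8")

topic, value = to_kafka_record({"query": "sanity"})
```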

Docker build

cd InboundTelemetryService/
docker build -t gos-inbound-telemetry-service .

Docker run

docker run -p 80:80 -e PYTHONUNBUFFERED=1 gos-inbound-telemetry-service

curl command for testing:

curl --location 'http://localhost:80/process' \
--header 'Content-Type: application/json' \
--data '{
    "query":"sanity"
}'

Kafka consumer to listen on the service's output topic:

winpty docker exec -it greenopsstem-kafka-1 kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic inbound-telemetry --from-beginning

view logs:

docker-compose logs -f inbound-telemetry-service

TelemetryWritingService (DataWritingService)

  • Input: Kafka topic(s): inbound-telemetry
  • Output Mongo collection: (DB:collection) gos_mongo:inbound_telemetry
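The write path can be sketched as below. Stamping an arrival `timestamp` is an assumption, motivated by the timestamp-sorted Mongo query shown further down; the Kafka consumer and Mongo client wiring are omitted:

```python
import json
import time

def to_document(raw_value: bytes) -> dict:
    """Turn a Kafka message value into a Mongo document.
    Assumption: the writer adds a 'timestamp' field on arrival,
    which the sort-by-timestamp query would then rely on."""
    doc = json.loads(raw_value)
    doc["timestamp"] = time.time()
    return doc

# In the real loop, something like:
# collection.insert_one(to_document(msg.value))
```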

Docker build

cd DataWritingService/
docker build -t gos-telemetry-writing-service .

Docker run

docker run -e PYTHONUNBUFFERED=1 gos-telemetry-writing-service

(This will likely fail outside Docker Compose, since no Kafka broker or MongoDB instance is available when the service runs standalone.)

Querying the service's output Mongo collection

Get all entries:

docker exec greenopsstem-mongo-1 mongosh --eval 'db.getSiblingDB("gos_mongo").inbound_telemetry.find().pretty()'

Get all entries sorted descending by timestamp:

docker exec greenopsstem-mongo-1 mongosh --eval 'db.getSiblingDB("gos_mongo").inbound_telemetry.find().sort({"timestamp": -1}).pretty()'

view logs:

docker-compose logs -f telemetry-writing-service

TelemetryIngestService

Reads raw telemetry messages from a Kafka topic, parses them, and pushes the results to another Kafka topic.

  • Input: Kafka topic inbound-telemetry
  • Output: Kafka topic branch-energy
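The parse step might look like the sketch below. The raw-message schema is not documented here, so this is a hypothetical projection: the field names are borrowed from the Mongo queries and response schema elsewhere in this README, not from the service code.

```python
import json

def parse_raw_telemetry(raw_value: bytes) -> bytes:
    """Hypothetical parse: project a raw telemetry message onto a
    branch-energy record. Field names come from the branch_energy
    queries and the response schema in this README; the real mapping
    lives in the service code."""
    msg = json.loads(raw_value)
    record = {
        "repo_name": msg.get("repo_name"),
        "branch_name": msg.get("branch_name"),
        "payload_timestamp": msg.get("payload_timestamp"),
        "energy_timestamp": msg.get("energy_timestamp"),
        "energy": msg.get("energy"),
    }
    return json.dumps(record).encode("utf-8")
```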

Docker build

cd TelemetryIngestService/
docker build -t gos-telemetry-ingest-service .

Docker run

docker run -e PYTHONUNBUFFERED=1 gos-telemetry-ingest-service

Kafka consumer to listen on the service's output topic:

winpty docker exec -it greenopsstem-kafka-1 kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic branch-energy --from-beginning

view logs:

docker-compose logs -f telemetry-ingest-service

BranchEnergyWritingService (DataWritingService)

  • Input: Kafka topic(s): branch-energy
  • Output Mongo collection: (DB:collection) gos_mongo:branch_energy

Docker build

cd DataWritingService/
docker build -t gos-branch-energy-writing-service .

Docker run

docker run -e PYTHONUNBUFFERED=1 gos-branch-energy-writing-service

(This will likely fail outside Docker Compose, since no Kafka broker or MongoDB instance is available when the service runs standalone.)

Querying the service's output Mongo collection

Get all entries:

docker exec greenopsstem-mongo-1 mongosh --eval 'db.getSiblingDB("gos_mongo").branch_energy.find().pretty()'

Get all entries sorted descending by payload_timestamp:

docker exec greenopsstem-mongo-1 mongosh --eval 'db.getSiblingDB("gos_mongo").branch_energy.find().sort({"payload_timestamp": -1}).pretty()'

Get all entries sorted descending by energy_timestamp:

docker exec greenopsstem-mongo-1 mongosh --eval 'db.getSiblingDB("gos_mongo").branch_energy.find().sort({"energy_timestamp": -1}).pretty()'

view logs:

docker-compose logs -f branch-energy-writing-service

BranchEnergyReadingService

  • Input: Kafka topic(s): branch-energy-data-request
  • Output: Kafka topic(s): branch-energy-data-response

Docker build

cd BranchEnergyReadingService/
docker build -t gos-branch-energy-reading-service .

Docker run

docker run -e PYTHONUNBUFFERED=1 gos-branch-energy-reading-service

(This will likely fail outside Docker Compose, since no Kafka broker or MongoDB instance is available when the service runs standalone.)

Kafka producer to push queries on the service's request (input) topic:

winpty docker exec -it greenopsstem-kafka-1 kafka-console-producer.sh --bootstrap-server localhost:9092 --topic branch-energy-data-request

query schema:

{
    "query_id": "ds90f8sd90f8sd09f8sdf",
    "repo_name": "sanity-repo",
    "branch_name": "sanity-branch"
}

Single-line form for pasting into the console producer (which treats each line as one message):

{"query_id": "ds90f8sd90f8sd09f8sdf", "repo_name": "sanity-repo", "branch_name": "sanity-branch"}
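Query messages can also be built programmatically; since the console producer sends one message per line, the JSON must stay on a single line. The uuid-based `query_id` below is an assumption (any unique string should work):

```python
import json
import uuid

def build_query(repo_name: str, branch_name: str) -> str:
    """Build a one-line query message for kafka-console-producer.sh.
    Assumption: query_id only needs to be unique per request."""
    return json.dumps({
        "query_id": uuid.uuid4().hex,
        "repo_name": repo_name,
        "branch_name": branch_name,
    })

line = build_query("sanity-repo", "sanity-branch")
```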

Kafka consumer to listen on the service's response (output) topic:

winpty docker exec -it greenopsstem-kafka-1 kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic branch-energy-data-response --from-beginning

response schema:

{
    "query_id": query_id,
    "repo_name": repo_name,
    "branch_name": branch_name,
    "energy": energy
}
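Since requests and responses travel on separate topics, a client has to correlate them by query_id. A minimal sketch of that matching step:

```python
import json

def match_response(request_line: str, response_line: str) -> bool:
    """Return True when a response belongs to the given request,
    matched on the shared query_id field."""
    return (json.loads(request_line)["query_id"]
            == json.loads(response_line)["query_id"])
```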

view logs:

docker-compose logs -f branch-energy-reading-service