
A package that enables applications to publish, consume and process high volumes of record streams in a fast and durable way.


kafka-pub-sub 👋

kafka-pub-sub is designed based on KafkaJS with the support of stream processing that enables applications to publish, consume and process high volumes of record streams in a fast and durable way.

🔖 Table Of Contents

  • 🌱 Prerequisites
  • ⏬ Installing
  • 👨‍💻 Example
  • 👌 Test
  • 💡 How To Contribute
  • 📈 Project Activity
  • 👤 Author
  • 🔏 License

🌱 Prerequisites

  • NPM/Yarn LTS
  • NodeJs

Back To The Top


⏬ Installing

💻 Desktop

If you use Linux, you may need to run the commands below with sudo:

npm i kafka-pub-sub

or

yarn add kafka-pub-sub

Back To The Top


👨‍💻 Example

Project structure

(project-structure diagram)

Note: Set up a NodeJS environment in both service-1 and service-2 and create a server using your favourite NodeJS framework.

service-1/server.js

const ProduceEvent = require('kafka-pub-sub/ProduceEvent');

app.post('/api', async (req, res) => {
    const fakeData = {
        Name: "Md. Muhtasim Fuad Fahim",
        Email: "[email protected]",
    };

    // Optional headers are attached to the produced Kafka message
    const headers = {
        'correlation-id': `1-${Date.now()}`,
        'system-id': 'my-system-id'
    };
    const producedEvent = await ProduceEvent('TEST_TOPIC', 'TEST_EVENT', fakeData, headers);
    console.log(producedEvent);

    return res.status(200).send("Done!");
});
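Kafka message values travel as byte buffers, so an object payload like fakeData has to be serialized before publishing; presumably kafka-pub-sub JSON-stringifies the event name and data together. The sketch below illustrates that round trip with a hypothetical envelope shape (an assumption for illustration, not the package's actual internal wire format):

```javascript
// Example payload, as in the producer snippet above
const fakeData = {
    Name: "Md. Muhtasim Fuad Fahim",
    Email: "[email protected]",
};

// Hypothetical envelope: event type plus payload serialized into one JSON value
const envelope = JSON.stringify({ event: 'TEST_EVENT', data: fakeData });

// A consumer on the other side parses it back into an object
const received = JSON.parse(envelope);
console.log(received.event);      // 'TEST_EVENT'
console.log(received.data.Name);  // 'Md. Muhtasim Fuad Fahim'
```

Whatever the real envelope looks like, the key point is that both services must agree on the serialization format, which the package handles for you.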

service-2/server.js

const ConsumeEvent = require('kafka-pub-sub/ConsumeEvent');

(async function consumedEvent() {
    const consumedData = await ConsumeEvent('TEST_TOPIC');
    console.log(consumedData);
})();

service-1/.env & service-2/.env

KAFKA_CLIENT_ID=test-client
KAFKA_BROKER_URL=localhost:9092
KAFKA_GROUP_ID=test-group
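These three variables are presumably mapped onto a KafkaJS client and consumer configuration inside the package. The sketch below shows the typical KafkaJS config shape those values would feed into (an assumption about kafka-pub-sub internals; the variable names come from the .env above):

```javascript
// Fall back to the sample values from the .env above if unset
process.env.KAFKA_CLIENT_ID = process.env.KAFKA_CLIENT_ID || 'test-client';
process.env.KAFKA_BROKER_URL = process.env.KAFKA_BROKER_URL || 'localhost:9092';
process.env.KAFKA_GROUP_ID = process.env.KAFKA_GROUP_ID || 'test-group';

// Typical KafkaJS client config built from these variables
const kafkaConfig = {
    clientId: process.env.KAFKA_CLIENT_ID,
    brokers: [process.env.KAFKA_BROKER_URL], // KafkaJS expects an array of broker addresses
};

// Consumers additionally need a group id for offset tracking
const consumerConfig = { groupId: process.env.KAFKA_GROUP_ID };

console.log(kafkaConfig.brokers[0]); // 'localhost:9092'
```

Note that brokers is an array: to point at a multi-broker cluster you would list every broker address, not just one.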

sample docker-compose.yml

version: '2.1'
services:
  zookeeper:
    hostname: zookeeper
    container_name: zookeeper
    image: confluentinc/cp-zookeeper:latest
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
      ZOOKEEPER_TICK_TIME: 2000
  kafka:
    hostname: kafka
    container_name: kafka
    image: confluentinc/cp-kafka:latest
    depends_on:
      - zookeeper
    ports:
      - 9092:9092
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
      KAFKA_INTER_BROKER_LISTENER_NAME: PLAINTEXT
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
      KAFKA_AUTO_CREATE_TOPICS_ENABLE: 'true'
      KAFKAJS_NO_PARTITIONER_WARNING: 1
      KAFKA_NUM_PARTITIONS: '6'

Now run both services on your machine and hit the API. 🥳

Back To The Top


👌 Test

  • Fork it 😎
  • Clone forked repository: git clone https://github.com/username/forked-name.git
  • Install the dependencies from root directory: npm install
  • Rename .env.example to .env
  • Now run npm run test and see the results 😎

💡 How To Contribute

  • Fork it 😎
  • Create a feature branch: git checkout -b my-feature
  • Add your changes: git add .
  • Commit your changes: git commit -m 'My new feature'
  • Push to the branch: git push origin my-feature
  • Submit a pull request

Contributions, issues and feature requests are welcome!
📮 Submit PRs to help solve issues or add features
🐛 Find and report issues
🌟 Star the project

Back To The Top


📈 Project Activity

(repository activity graph)

Back To The Top


👤 Author

🤓 Md. Muhtasim Fuad Fahim [email protected]

Back To The Top


🔏 License

Copyright © 2023 Md. Muhtasim Fuad Fahim

This project is licensed under the MIT License.

Back To The Top

