Setting Up a Kafka Consumer in Node.js

What is a Kafka Consumer?

A Kafka Consumer is an essential component of Apache Kafka, a distributed streaming platform. It subscribes to Kafka topics and processes the stream of records produced to those topics, enabling applications to act upon or analyze the data in real-time.

Kafka Consumers are key in scenarios requiring real-time data processing. Organized into consumer groups, they scale horizontally across partitions and recover from failures, making them well suited to handling massive streams of data.

Use Cases:

  • Real-Time Data Processing: Consuming and processing data for real-time applications, such as analytics or monitoring systems.
  • Microservices Communication: Using Kafka as a message broker between different microservices.
  • Log Aggregation: Reading and processing logs from multiple systems for centralized analysis.

Prerequisites

  • Node.js (preferably the latest LTS version)
  • Apache Kafka (Installation guide: Apache Kafka Quickstart)
  • kafka-node library or similar (Install using npm install kafka-node)

Step-by-Step Guide

Step 1: Install Kafka Node.js Client

First, install a Kafka client library for Node.js, such as kafka-node:

npm install kafka-node

Step 2: Create Kafka Consumer Configuration

Set up the consumer configuration in your Node.js application:

const kafka = require("kafka-node");
const Consumer = kafka.Consumer;
const client = new kafka.KafkaClient({ kafkaHost: "localhost:9092" });
const consumer = new Consumer(client, [{ topic: "myTopic", partition: 0 }], {
  autoCommit: false, // commit offsets manually rather than automatically
});
Step 3: Implement Message Processing Logic

Define the logic for processing messages received from Kafka:

consumer.on("message", function (message) {
  console.log("Received Message:", message);
});
consumer.on("error", function (err) {
  console.log("Error:", err);
});
Step 4: Run Your Node.js Application

Run your application to start consuming messages:

node your-consumer-app.js
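Since the consumer holds an open connection to the broker, it is worth closing it cleanly when the process exits. A minimal sketch, assuming the `consumer` object from Step 2 is in scope (`consumer.close(force, cb)` is part of kafka-node's Consumer API; the `closeConsumer` wrapper is illustrative):

```javascript
// Illustrative wrapper: close the consumer, then invoke `done`.
function closeConsumer(consumer, done) {
  // force = true closes immediately; kafka-node invokes the callback
  // once the connection is torn down.
  consumer.close(true, done);
}

// Exit cleanly on Ctrl+C, assuming `consumer` from Step 2 is in scope.
process.on("SIGINT", function () {
  closeConsumer(consumer, function () {
    process.exit(0);
  });
});
```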


Conclusion

Setting up a Kafka Consumer in Node.js enables your application to consume and process data from Kafka in real time. This setup is essential for building Node.js applications that require scalable and efficient data processing.

For advanced consumer configurations and usage, including handling rebalancing and committing offsets, consult the documentation of your chosen Kafka Node.js client library (like kafka-node). This guide provides the foundational steps to integrate Kafka Consumers into your Node.js applications.
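For instance, because the configuration in Step 2 sets autoCommit: false, the application is responsible for committing offsets itself. The sketch below shows one illustrative approach, committing after every N processed messages via kafka-node's consumer.commit(cb); the batch size and the attachCommitLogic helper are assumptions for this example, not part of the library:

```javascript
const COMMIT_EVERY = 100; // hypothetical batch size

// Pure helper: commit after every COMMIT_EVERY-th processed message.
function shouldCommit(processedCount, batchSize) {
  return processedCount > 0 && processedCount % batchSize === 0;
}

// Wire manual offset commits onto the consumer from Step 2.
function attachCommitLogic(consumer) {
  let processed = 0;
  consumer.on("message", function (message) {
    processed += 1;
    if (shouldCommit(processed, COMMIT_EVERY)) {
      consumer.commit(function (err) {
        if (err) console.error("Offset commit failed:", err);
      });
    }
  });
}
```

Committing in batches trades a little redelivery risk (messages since the last commit may be reprocessed after a crash) for far fewer round trips to the broker than committing per message.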