
Getting Started with Microservices in Node.js Using Kafka

Microservices architecture has become a popular approach for building scalable and maintainable applications. By breaking down applications into smaller, independent services, teams can develop, deploy, and scale each service independently. When coupled with Apache Kafka, a powerful distributed streaming platform, the advantages of microservices are further amplified. This article will guide you through the initial steps of getting started with microservices in Node.js using Kafka.

Why Microservices?

Before diving into the technical details, it’s essential to understand why microservices can be beneficial:

Scalability: Each microservice can be scaled independently based on its specific demand.

Resilience: Failures are isolated to individual services, reducing the impact on the overall system.

Flexibility: Different services can be developed using different technologies best suited for each specific task.

Faster Development: Smaller, focused teams can work on individual services, speeding up the development process.

Why Apache Kafka?

Apache Kafka is a distributed event streaming platform capable of handling trillions of events a day. It’s a perfect fit for microservices due to:

High Throughput: Kafka can handle a large number of events per second, making it ideal for real-time data processing.

Scalability: Kafka’s distributed nature allows it to scale horizontally.

Durability: Kafka replicates data across multiple servers for fault tolerance.

Integration: Kafka has connectors for various data sources and sinks, making integration with other systems straightforward.

Setting Up Your Environment

Prerequisites

Node.js: Make sure you have Node.js installed. You can download it from the official website.

Kafka: Kafka can be downloaded and installed from the Apache Kafka website.

Project Initialization

First, create a new Node.js project:

mkdir nodejs-microservices-kafka
cd nodejs-microservices-kafka
npm init -y

Installing Required Packages

We’ll need the kafkajs library to interact with Kafka:

npm install kafkajs

Creating a Kafka Producer

Producers are responsible for sending records to Kafka topics. Let’s create a simple Kafka producer in Node.js.

Create a file named producer.js:

const { Kafka } = require('kafkajs');

const kafka = new Kafka({
  clientId: 'my-producer',
  brokers: ['localhost:9092']
});

const producer = kafka.producer();

const run = async () => {
  await producer.connect();
  await producer.send({
    topic: 'test-topic',
    messages: [
      { value: 'Hello KafkaJS user!' },
    ],
  });

  await producer.disconnect();
};

run().catch(console.error);

In this code:

we initialize a Kafka client, connect the producer, send a message to the test-topic topic, and then disconnect the producer.
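In practice, each message often carries a key (so related events land on the same partition and keep their ordering) and headers alongside the value. kafkajs accepts these as extra fields on each message object. Here is a small sketch of building such a payload; the buildMessage helper is our own, hypothetical name, not part of the kafkajs API:

```javascript
// Sketch: build a kafkajs-style message object with a key and headers.
// `buildMessage` is a hypothetical helper; kafkajs itself simply accepts
// plain { key, value, headers } objects in the `messages` array.
const buildMessage = (key, payload) => ({
  key,                             // messages with the same key go to the same partition
  value: JSON.stringify(payload),  // kafkajs accepts strings or Buffers as values
  headers: { 'content-type': 'application/json' },
});

const msg = buildMessage('user-42', { action: 'login' });
console.log(msg.key);   // → user-42
console.log(msg.value); // → {"action":"login"}
```

You would then pass such objects in the messages array of producer.send() instead of the bare { value: ... } shown above.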

Creating a Kafka Consumer

Consumers read records from Kafka topics. Let’s create a simple Kafka consumer in Node.js.

Create a file named consumer.js:

const { Kafka } = require('kafkajs');

const kafka = new Kafka({
  clientId: 'my-consumer',
  brokers: ['localhost:9092']
});

const consumer = kafka.consumer({ groupId: 'test-group' });

const run = async () => {
  await consumer.connect();
  await consumer.subscribe({ topic: 'test-topic', fromBeginning: true });

  await consumer.run({
    eachMessage: async ({ topic, partition, message }) => {
      console.log({
        partition,
        offset: message.offset,
        value: message.value.toString(),
      });
    },
  });
};

run().catch(console.error);

In this code:

we initialize a Kafka client, connect the consumer, subscribe to the test-topic topic, and print each message to the console as it arrives.
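One practical tip: the body of eachMessage is a good candidate for a plain function, which you can then unit-test without a running broker. A minimal sketch under that assumption (decodeRecord is our own name; the argument shape mirrors what kafkajs hands to eachMessage, including the value arriving as a Buffer):

```javascript
// Sketch: pure decoding logic extracted from the eachMessage handler.
// kafkajs delivers message.value as a Buffer (or null for tombstones),
// so we guard before calling toString().
const decodeRecord = ({ partition, message }) => ({
  partition,
  offset: message.offset,
  value: message.value ? message.value.toString() : null,
});

// Simulate what kafkajs would pass in:
const record = decodeRecord({
  partition: 0,
  message: { offset: '0', value: Buffer.from('Hello KafkaJS user!') },
});
console.log(record.value); // → Hello KafkaJS user!
```

The handler in consumer.js could then reduce to `console.log(decodeRecord({ partition, message }))`, with the decoding logic covered by ordinary unit tests.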

Running the Producer and Consumer

Make sure your Kafka server is running. You can start Kafka by following the instructions on the Apache Kafka quick start guide.
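If you would rather not install Kafka manually, the quick start also describes running it in a container. Assuming Docker is installed, a single-node development broker can be started roughly like this (the image tag is an assumption; check the quick start for the current one):

```shell
# Start a single-node Kafka broker for local development,
# exposing the default port 9092 used by the examples above.
docker run -d --name kafka -p 9092:9092 apache/kafka:latest
```

This is only for local experimentation; a production setup needs replication and persistent storage.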

First, start the consumer:

node consumer.js

Then, run the producer to send a message:

node producer.js

You should see the message received by the consumer printed to the console.

Integrating Microservices

In a real-world application, each microservice would likely have its own producer and consumer, interacting with multiple Kafka topics. Here’s a simplified example:

User Service: Produces events when a user signs up.

Email Service: Consumes user sign-up events and sends a welcome email.

User Service Example

Create a new file named userService.js:

const { Kafka } = require('kafkajs');

const kafka = new Kafka({
  clientId: 'user-service',
  brokers: ['localhost:9092']
});

const producer = kafka.producer();

// For simplicity we connect and disconnect per call; a long-lived service
// would connect once at startup and reuse the producer.
const signUpUser = async (username) => {
  await producer.connect();
  await producer.send({
    topic: 'user-signups',
    messages: [
      { value: JSON.stringify({ username }) },
    ],
  });
  await producer.disconnect();
};

signUpUser('new_user').catch(console.error);

Email Service Example

Create a new file named emailService.js:

const { Kafka } = require('kafkajs');

const kafka = new Kafka({
  clientId: 'email-service',
  brokers: ['localhost:9092']
});

const consumer = kafka.consumer({ groupId: 'email-group' });

const run = async () => {
  await consumer.connect();
  await consumer.subscribe({ topic: 'user-signups', fromBeginning: true });

  await consumer.run({
    eachMessage: async ({ topic, partition, message }) => {
      const user = JSON.parse(message.value.toString());
      console.log(`Sending welcome email to ${user.username}`);
    },
  });
};

run().catch(console.error);
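Note that the email service calls JSON.parse on whatever arrives: a single malformed event on the topic would throw inside eachMessage. A more defensive approach is to validate events before acting on them. A minimal sketch, where parseSignupEvent is our own hypothetical helper:

```javascript
// Sketch: defensively parse a user-signup event before acting on it.
// `parseSignupEvent` is a hypothetical helper, not part of kafkajs.
// Returns the event object, or null if the payload is unusable.
const parseSignupEvent = (raw) => {
  try {
    const event = JSON.parse(raw);
    if (typeof event.username !== 'string' || event.username.length === 0) {
      return null; // valid JSON, but not a sign-up event we understand
    }
    return event;
  } catch (err) {
    return null; // not valid JSON at all
  }
};

console.log(parseSignupEvent('{"username":"new_user"}')); // → { username: 'new_user' }
console.log(parseSignupEvent('not json'));                // → null
```

Inside eachMessage, the email service would call this first and skip (or dead-letter) anything that comes back null, rather than crashing the consumer.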

Running the Services

Start the email service:

node emailService.js

Then, simulate a user sign-up by running the user service:

node userService.js

You should see a message indicating that a welcome email is being sent.

Conclusion

By using Kafka with Node.js, you can create scalable and resilient microservices that communicate through a high-throughput event streaming platform. This architecture allows for independent development, deployment, and scaling of services, enabling teams to build robust and flexible systems. With the basics covered, you can now start exploring more advanced Kafka features and dive deeper into the world of microservices. Happy coding!