Part 2: Creating Topics, Consumers and Producers

Mohit Talniya
Aug 23, 2018 · 3 min read


Part 1: Kafka Core Concepts

Start by installing Kafka on your dev environment. We will use Landoop's fast-data-dev Kafka Docker image.

$ docker run --rm -it \
  -p 2181:2181 -p 3030:3030 -p 8081:8081 \
  -p 8082:8082 -p 8083:8083 -p 9092:9092 \
  -e ADV_HOST=127.0.0.1 \
  landoop/fast-data-dev

You should see the Kafka Development Environment dashboard by hitting http://127.0.0.1:3030 in your browser.
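If you prefer to verify from the terminal first, a quick check along these lines should print an HTTP 200 once the UI is up (this assumes curl is available on your host):

$ curl -s -o /dev/null -w "%{http_code}\n" http://127.0.0.1:3030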

Kafka Development Environment

Start a Kafka command line:

$ docker run --rm -it --net=host landoop/fast-data-dev bash

Create a topic by providing the ZooKeeper address, the number of partitions and the replication factor, as discussed in Part 1.

root@fast-data-dev / $ kafka-topics --zookeeper 127.0.0.1:2181 --create --topic hello_topic --partitions 3 --replication-factor 1
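Keep in mind that the replication factor cannot exceed the number of brokers, and this dev image runs a single broker. A command like the following (purely illustrative, bad_topic is a made-up name) should be rejected with an error saying the replication factor is larger than the number of available brokers:

root@fast-data-dev / $ kafka-topics --zookeeper 127.0.0.1:2181 --create --topic bad_topic --partitions 3 --replication-factor 2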

Kafka Dashboard UI

List the Kafka topics:

root@fast-data-dev / $ kafka-topics --zookeeper 127.0.0.1:2181 --list

Kafka topic list

Let’s look at the newly created hello_topic in detail. You can see the 3 partitions that were created, along with the elected leader for each.

root@fast-data-dev / $ kafka-topics --zookeeper 127.0.0.1:2181 --describe --topic hello_topic
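The output has one summary line for the topic followed by one line per partition, showing the leader broker, the replica list and the in-sync replicas (Isr). On this single-broker setup it should look roughly like the following (the broker id may differ on your machine):

Topic:hello_topic   PartitionCount:3   ReplicationFactor:1   Configs:
    Topic: hello_topic   Partition: 0   Leader: 0   Replicas: 0   Isr: 0
    Topic: hello_topic   Partition: 1   Leader: 0   Replicas: 0   Isr: 0
    Topic: hello_topic   Partition: 2   Leader: 0   Replicas: 0   Isr: 0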

Describe Kafka Topic

Let’s start a console producer and send some data to the topic.

root@fast-data-dev / $ kafka-console-producer --broker-list 127.0.0.1:9092 --topic hello_topic
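The console producer gives you a > prompt, and every line you type is sent to hello_topic as a separate message. For example (the messages themselves are arbitrary):

>hello kafka
>my first message
>one more message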

Check the Kafka Topics dashboard. You should see the data along with the partition and offset it was stored at.

Data in Kafka Topics

Let’s spin up a consumer with the following command and send some more data from the producer. You should see the data appear on the consumer side.

root@fast-data-dev / $ kafka-console-consumer --bootstrap-server 127.0.0.1:9092 --topic hello_topic
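By default the console consumer only shows messages produced after it starts. To also replay the messages you sent earlier, add the --from-beginning flag, which reads the topic from the earliest offset:

root@fast-data-dev / $ kafka-console-consumer --bootstrap-server 127.0.0.1:9092 --topic hello_topic --from-beginning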

Coming up: Part 3: Kafka Integration in Spring Boot Microservices.
