Message processing through Kafka

This guide explains how to set up a Kafka environment and process messages after creating a cluster using KakaoCloud's Advanced Managed Kafka service.

Basic information

About this scenario

KakaoCloud's Advanced Managed Kafka service provides fully managed Kafka clusters, making it easy to build high-performance data streaming and event processing systems. This tutorial explains how to create a Kafka cluster using Advanced Managed Kafka, set up a Kafka environment, and send and receive messages.

Key topics include:

  • Creating an Advanced Managed Kafka cluster
  • Setting up the Kafka environment
  • Creating a topic
  • Running producers and consumers to send and receive messages

Before you start

In this preparation phase, you'll configure the network, set up security groups, and create an Advanced Managed Kafka cluster.

1. Network configuration

  1. Go to KakaoCloud Console > Beyond Networking Service > VPC.

  2. Click [Create VPC] and configure as follows:

VPC: tutorial-amk-vpc

| Key | Value |
| --- | --- |
| VPC name | tutorial-amk-vpc |
| VPC IP CIDR block | 10.0.0.0/16 |
| Availability Zone | 2 |
| AZ 1 | kr-central-2-a |
| AZ 2 | kr-central-2-b |
| Public subnet (AZ1) | 10.0.0.0/20 |
| Public subnet (AZ2) | 10.0.16.0/20 |

Click [Create].

2. Security group settings

  1. Go to KakaoCloud Console > Beyond Networking Service > VPC > Security Group.

  2. Create two security groups as follows:

Security group: tutorial-amk-sg

| Key | Value |
| --- | --- |
| Name | tutorial-amk-sg |
| Protocol | TCP |
| Source | 0.0.0.0/0 |
| Port | 9092 |
| Description | cluster inbound policy 1 |

Security group: tutorial-vm-sg

| Key | Value |
| --- | --- |
| Name | tutorial-vm-sg |
| Protocol | TCP |
| Source | {Your Public IP}/32 |
| Port | 22 |
| Description | kafka inbound policy 1 |

3. Create Advanced Managed Kafka cluster

  1. Go to KakaoCloud Console > Analytics > Advanced Managed Kafka.

  2. Click [Create Cluster] and configure as follows:

Basic Settings:

| Key | Value |
| --- | --- |
| Cluster name | tutorial-amk-cluster |
| Kafka version | 3.7.1 |
| Port | 9092 |

Instance Type:

| Key | Value |
| --- | --- |
| Instance type | r2a.2xlarge |

Network Settings:

| Key | Value |
| --- | --- |
| VPC | tutorial-amk-vpc |
| Subnet | main (10.0.0.0/20) |
| Security Group | tutorial-amk-sg |

Broker Configuration:

| Key | Value |
| --- | --- |
| Availability Zones | 2 |
| Brokers per AZ | 2 |
| Volume type/size | SSD / 50GB |
| Max IOPS | 3000 |

Click [Create]. Verify that the cluster status changes from Starting through Creating to Active.

4. Create Kafka access VM

  1. Go to KakaoCloud Console > Beyond Compute Service > Virtual Machine.

  2. Click [Create instance] and configure as follows:

    | Key | Value |
    | --- | --- |
    | Name | tutorial-amk-vm |
    | OS | Ubuntu 24.04 |
    | Instance type | m2a.large |
    | Root volume | 10GB / SSD |
    | Key pair | {USER_KEYPAIR} |
    | VPC | tutorial-amk-vpc |
    | Subnet | main (10.0.0.0/20) |
    | Security group | tutorial-vm-sg |

Getting started

The detailed steps for configuring a Kafka environment and processing messages are as follows:

Step 1. Configure VM instance access

Assign a public IP to access the VM instance created earlier.

info

If SSH reports a bad permissions error for the key pair file, resolve it by restricting the file's permissions with chmod 400.


  1. Go to KakaoCloud Console > Beyond Compute Service > Virtual Machine.

  2. In the Instances tab, click on the tutorial-amk-vm instance.

  3. Click [Instance Actions] > [Associate Public IP].

  4. In the Associate Public IP window, click [Confirm] without modifying anything.

    • The assigned public IP can be checked in the Instance List or under the Network tab in the instance details.
  5. Open the terminal on your local machine and navigate to the folder containing the key pair file.

    cd ~/Downloads
  6. Grant read permission to the key pair file:

    chmod 400 ${PRIVATE_KEY}.pem

    | Environment Variable | Description |
    | --- | --- |
    | PRIVATE_KEY | Key pair file name |
  7. Access the VM instance via SSH:

    ssh -i ${PRIVATE_KEY}.pem ubuntu@${TUTORIAL-AMK-VM_PUBLIC_IP}

    | Environment Variable | Description |
    | --- | --- |
    | PRIVATE_KEY | Key pair file name |
    | TUTORIAL-AMK-VM_PUBLIC_IP | Public IP of the tutorial-amk-vm instance |
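A note on the 400 mode used above: it means read-only for the owner and no access for anyone else, which is what OpenSSH requires before it will use a private key. A quick local sketch (using a throwaway `/tmp/demo_key.pem` stand-in, not your real key) shows the effect:

```shell
# create a throwaway file standing in for a private key
touch /tmp/demo_key.pem

# 400 = read for the owner only, no access for group/others
chmod 400 /tmp/demo_key.pem

# print the octal and symbolic permissions (GNU coreutils stat)
stat -c '%a %A' /tmp/demo_key.pem
# → 400 -r--------
```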

Step 2. Set up Kafka environment

Set up the Kafka environment on the VM instance. Run the following commands in the terminal:

1. Install Java and configure environment variables

Kafka requires JDK. This guide uses OpenJDK 21.

sudo apt update
sudo apt install -y openjdk-21-jdk

cat << EOF | sudo tee -a /etc/profile
export JAVA_HOME=/usr/lib/jvm/java-21-openjdk-amd64
export PATH=\$JAVA_HOME/bin:\$PATH
EOF

source /etc/profile
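A side note on the `\$` escapes in the heredoc above: an unescaped `$VAR` is expanded by the shell at the moment the heredoc is written, while `\$VAR` lands in the file literally and is expanded later, each time /etc/profile is sourced. A minimal sketch (writing to a throwaway `/tmp/demo_profile` instead of /etc/profile):

```shell
NAME=now
cat << EOF > /tmp/demo_profile
expanded_at_write=$NAME
expanded_at_source=\$NAME
EOF

cat /tmp/demo_profile
# → expanded_at_write=now
# → expanded_at_source=$NAME
```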

2. Download and extract Kafka

info

Download Kafka using curl from the Apache official site. This guide uses Kafka 3.7.1.

curl https://archive.apache.org/dist/kafka/3.7.1/kafka_2.13-3.7.1.tgz -o kafka_2.13-3.7.1.tgz
tar -xzf kafka_2.13-3.7.1.tgz
rm kafka_2.13-3.7.1.tgz
mv kafka_2.13-3.7.1 kafka
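Apache publishes SHA-512 checksums alongside each release (e.g. a `kafka_2.13-3.7.1.tgz.sha512` file under the same URL), and it is good practice to verify the download before extracting. The verification pattern, sketched here against a local stand-in file rather than the real archive:

```shell
# stand-in for a downloaded archive and its published checksum file
printf 'demo payload' > /tmp/demo.tgz
sha512sum /tmp/demo.tgz | awk '{print $1}' > /tmp/demo.tgz.sha512

# verification: recompute the hash and compare it to the recorded one
if [ "$(sha512sum /tmp/demo.tgz | awk '{print $1}')" = "$(cat /tmp/demo.tgz.sha512)" ]; then
  echo "checksum OK"
else
  echo "checksum MISMATCH" >&2
fi
# → checksum OK
```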

Step 3. Create and check topics

Once Kafka is installed, navigate to the Kafka folder and create topics.


  1. Move to the Kafka folder using cd.

    cd kafka
  2. Create a topic using the following command:

    bin/kafka-topics.sh --create --topic ${TOPIC_NAME} --bootstrap-server ${HOST:PORT}

    | Environment Variable | Description |
    | --- | --- |
    | TOPIC_NAME | Topic name to create / Example: tutorial-topic |
    | HOST:PORT | Copy bootstrap server info from Advanced Managed Kafka > Cluster menu > tutorial-amk-cluster |
  3. Describe the topic details:

    bin/kafka-topics.sh --describe --topic ${TOPIC_NAME} --bootstrap-server ${HOST:PORT}

    | Environment Variable | Description |
    | --- | --- |
    | TOPIC_NAME | Topic name to query / Example: tutorial-topic |
    | HOST:PORT | Copy bootstrap server info from Advanced Managed Kafka > Cluster menu > tutorial-amk-cluster |
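Without extra flags, the topic is created with the broker's default partition count and replication factor. kafka-topics.sh also accepts these explicitly; a sketch with illustrative values (the cluster above has 2 AZs × 2 brokers = 4 brokers, so a replication factor of 3 fits, but adjust to your workload):

```shell
# create a topic with an explicit partition count and replication factor
bin/kafka-topics.sh --create \
  --topic ${TOPIC_NAME} \
  --bootstrap-server ${HOST:PORT} \
  --partitions 3 \
  --replication-factor 3
```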

Step 4. Run producer client

After creating a topic, run the producer client to send messages. When the > prompt appears in the terminal, the producer is ready to accept input.

bin/kafka-console-producer.sh --topic ${TOPIC_NAME} --bootstrap-server ${HOST:PORT}

| Environment Variable | Description |
| --- | --- |
| TOPIC_NAME | Topic name for the producer to send data to / Example: tutorial-topic |
| HOST:PORT | Copy bootstrap server info from Advanced Managed Kafka > Cluster menu > tutorial-amk-cluster |
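The console producer reads one message per line from stdin, so it can also be fed non-interactively, which is handy for scripting. A sketch:

```shell
# send two messages without an interactive prompt
printf 'hello kafka\nanother message\n' | \
  bin/kafka-console-producer.sh --topic ${TOPIC_NAME} --bootstrap-server ${HOST:PORT}
```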

Step 5. Run consumer client

In a new terminal window, run the consumer client to verify that messages sent by the producer are received.

  1. Open a new terminal and navigate to the key pair folder:

    cd ~/Downloads
  2. Access the VM instance via SSH:

    ssh -i ${PRIVATE_KEY}.pem ubuntu@${TUTORIAL-AMK-VM_PUBLIC_IP}

    | Environment Variable | Description |
    | --- | --- |
    | PRIVATE_KEY | Key pair file name |
    | TUTORIAL-AMK-VM_PUBLIC_IP | Public IP of the tutorial-amk-vm instance |
  3. Navigate to the Kafka folder:

    cd kafka
  4. Run the consumer client:

    bin/kafka-console-consumer.sh --topic ${TOPIC_NAME} --bootstrap-server ${HOST:PORT}

    | Environment Variable | Description |
    | --- | --- |
    | TOPIC_NAME | Topic name for the consumer to receive data from / Example: tutorial-topic |
    | HOST:PORT | Copy bootstrap server info from Advanced Managed Kafka > Cluster menu > tutorial-amk-cluster |
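By default the console consumer only shows messages produced after it starts. To replay everything still retained on the topic, add the `--from-beginning` flag:

```shell
# read the topic from the earliest retained offset
bin/kafka-console-consumer.sh --topic ${TOPIC_NAME} \
  --bootstrap-server ${HOST:PORT} \
  --from-beginning
```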

Step 6. Send and receive messages

  1. In the producer client terminal, type hello kafka after > and press Enter.

  2. Verify that the hello kafka message appears in the consumer client terminal.