Kafka Simplified with kcat (kafka cat) 🚀
Sergio Marcial
Posted on September 12, 2023
In our continuous travels to demystify Kafka, there are multiple tools that can help us better understand this powerful event streaming platform.
Say hello to kcat, your trusty companion in the world of Apache Kafka! Whether you're just starting your journey or you're a seasoned pro looking to level up your Kafka skills, this guide aims to add a new tool to your toolbox. We'll start with installing kcat on Unix, macOS, and Windows, walk through some basic examples, and then explore advanced ones, including custom authentication with SASL and keystores, to help you become a Kafka wizard. Buckle up, and let's dive into the world of Kafka with kcat! 🌟
Introducing kcat
Before we delve into the installation and usage, let's get to know kcat. It's a command-line utility that's part of the Confluent ecosystem, designed to simplify Kafka interactions. With kcat, you can consume, produce, and interact with Kafka topics seamlessly, making it an essential tool for software engineers working with real-time data streams.
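For a quick taste of what kcat can do (once installed, as shown below), its metadata listing mode prints the brokers, topics, and partitions a cluster exposes; the broker address here is a placeholder used throughout this post:
kcat -L -b kafka_broker_address
If this prints your topic list, connectivity and configuration are in good shape.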
Installation
Unix and macOS
Option 1: Package Manager (Recommended)
Unix-based systems, including macOS, offer the convenience of package managers for kcat installation. If you're on a system with APT (e.g., Debian/Ubuntu), run:
sudo apt-get install kcat
For systems using YUM (e.g., CentOS/Red Hat), use:
sudo yum install kcat
If you're using macOS with Homebrew, installation is as simple as:
brew install kcat
Option 2: Manual Installation
For those who prefer manual installation, visit the official Confluent Hub (https://docs.confluent.io/current/clients/confluent-kafka-python/html/index.html) to download the appropriate binary for your system. After downloading, follow these steps:
- Make the binary executable:
chmod +x kcat # Replace with the actual filename for your system
- Move it to a directory in your PATH (e.g., /usr/local/bin/):
sudo mv kcat /usr/local/bin/ # Replace with the actual filename for your system
Windows
Windows users can also enjoy the power of kcat by downloading the Windows binary from the official Confluent Hub (https://docs.confluent.io/current/clients/confluent-kafka-python/html/index.html) and following these steps:
- Download the Windows binary.
- Rename the binary to kcat.exe.
- Add the directory containing kcat.exe to your system's PATH.
Now that you have kcat installed, let's explore its features with practical examples! 🛠️
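Before diving in, it's worth a quick sanity check that the binary is on your PATH. kcat can print its version and the features it was built with:
kcat -V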
Basic Usage
Kafka Message Consumption
Let's start with the basics by consuming messages from a Kafka topic. Suppose you have a Kafka topic named my_topic, and your Kafka broker is running at kafka_broker_address. You can use kcat to consume and display messages in real time:
kcat -b kafka_broker_address -t my_topic
Replace kafka_broker_address with the address of your Kafka broker. kcat will continuously display messages from my_topic as they arrive.
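A few optional flags make ad-hoc consumption more precise. The sketch below uses the same placeholder names, explicitly selects consumer mode with -C, starts from the beginning of the topic, stops after ten messages, and prints the partition, offset, and key alongside each payload:
kcat -C -b kafka_broker_address -t my_topic -o beginning -c 10 \
  -f 'Partition %p, offset %o, key %k: %s\n'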
Producing Kafka Messages
Producing messages to Kafka is just as straightforward. To send a message to a Kafka topic, you can use kcat like this:
echo "Hello, Kafka!" | kcat -P -b kafka_broker_address -t my_topic
This command sends the message "Hello, Kafka!" to my_topic. The -P flag indicates that we're producing a message.
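Kafka messages can also carry a key, which determines the partition they land on. With kcat you can supply a key by choosing a delimiter with -K; in this sketch (key and payload are just illustrative) everything before the colon becomes the key and the rest becomes the payload:
echo "user42:Hello, Kafka!" | kcat -P -b kafka_broker_address -t my_topic -K: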
Advanced Usage
Custom Consumer Groups
Kafka supports consumer groups for parallel message consumption. You can specify a custom consumer group with kcat:
kcat -b kafka_broker_address -G my_consumer_group my_topic
Replace my_consumer_group with the name of your custom consumer group. The -G flag switches kcat into balanced consumer group mode, with the topic(s) listed after the group name, and kcat handles group coordination for you.
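Because the group balances partitions across its members, running the same command in a second terminal splits the partitions between the two instances. You can also subscribe one group to several topics at once (the extra topic name here is illustrative):
kcat -b kafka_broker_address -G my_consumer_group my_topic my_other_topic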
Consuming and Writing to a File
Often, you may want to consume messages from a Kafka topic and save them to a file for later analysis. You can achieve this with kcat:
kcat -b kafka_broker_address -t my_topic > output.txt
This command continuously consumes messages from my_topic and writes them to output.txt. Use >> instead of > if you want to append to an existing file.
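If you want a one-off snapshot rather than a never-ending stream, add -o beginning to capture the topic from the start and -e so kcat exits once it reaches the end of every partition (the file name is just an example):
kcat -C -b kafka_broker_address -t my_topic -o beginning -e > snapshot.txt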
Filtering and Transforming Messages
kcat allows you to apply filtering and transformations to consumed messages using a simple pipe (|) syntax. For instance, you can filter messages containing the word "error" like this:
kcat -b kafka_broker_address -t my_topic | grep "error"
This command displays only the messages containing "error," making it easier to focus on specific events.
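The same pattern works with any Unix tool. For instance, combining the -o beginning and -e flags from the previous example with grep -c gives a rough count of how many messages in the topic mention "error":
kcat -C -b kafka_broker_address -t my_topic -o beginning -e -q | grep -c "error"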
Kafka Header Support
Kafka message headers provide additional metadata. kcat supports header manipulation. To send a message with a custom header, you can use the -H option:
echo "Important message" | kcat -P -b kafka_broker_address -t my_topic -H "header_key=header_value"
This sends a message with a custom header to my_topic.
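On the consuming side, headers aren't shown by default; you can surface them with a format string, where %h expands to the message's headers and %s to its payload:
kcat -C -b kafka_broker_address -t my_topic -f 'Headers: %h  Payload: %s\n'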
Custom Authentication
Kafka often requires custom authentication mechanisms for secure communication. If your Kafka cluster uses SASL (Simple Authentication and Security Layer) and truststore/keystore certs, you can use kcat with the following options:
SASL
kcat -b kafka_broker_address -t my_topic \
  -X security.protocol=SASL_SSL \
  -X sasl.mechanisms=PLAIN \
  -X sasl.username=my_username \
  -X sasl.password=my_password
Truststore/Keystore
kcat -b kafka_broker_address -t my_topic \
  -X security.protocol=SSL \
  -X ssl.ca.location=/path/to/ca.crt \
  -X ssl.keystore.location=/path/to/client.keystore.p12 \
  -X ssl.keystore.password=my_keystore_password
Replace the placeholders with your specific configurations and paths. Note that kcat is built on librdkafka, which doesn't read JKS truststores: the CA certificate passed via ssl.ca.location plays the truststore role, and ssl.keystore.location expects a PKCS#12 keystore. With these options, kcat handles the secure communication over SASL and SSL for you.
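Typing these -X properties on every invocation gets tedious. Depending on your kcat version, you may be able to keep the same key=value pairs (one per line) in a properties file and point kcat at it with -F; treat this as a sketch and check kcat -h on your build:
kcat -F kcat.conf -b kafka_broker_address -t my_topic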
Supercharge Kafka with kcat and jq
Now, let's take it up a notch! Combine kcat with jq, a powerful JSON processor, to work with Kafka messages efficiently. Suppose you have JSON messages in your Kafka topic. You can consume, process, and display them using jq like this:
kcat -b kafka_broker_address -t my_json_topic -q | jq .
The -q flag silences kcat's informational output, so only the message payloads reach stdout. We pipe that output to jq to pretty-print the JSON messages.
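kcat can also wrap each message in a JSON envelope of its own via the -J flag, including the topic, partition, offset, timestamp, key, and payload. That pairs nicely with jq for pulling out specific fields; the jq expression below is only an illustration and assumes the payload itself is JSON:
kcat -C -b kafka_broker_address -t my_json_topic -J -e -q | jq '{offset: .offset, body: (.payload | fromjson)}'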
Conclusion
With kcat in your arsenal, you're well-equipped to conquer Kafka's intricacies. Whether you're debugging, monitoring, or building real-time data processing pipelines, kcat is your Swiss Army knife for Kafka tasks. Start exploring its capabilities, install kcat on your system, and embark on your Kafka journey with confidence. Happy streaming! 🌊🪄✨