Starlight for Kafka
Starlight for Kafka brings native Apache Kafka® protocol support to Apache Pulsar®, enabling migration of existing Kafka applications and services to Pulsar without modifying the code. Kafka applications can now leverage Pulsar’s powerful features, such as:
- Streamlined operations with enterprise-grade multi-tenancy
- Simplified operations with a rebalance-free architecture
- Infinite event stream retention with Apache BookKeeper™ and tiered storage
- Serverless event processing with Pulsar Functions
By integrating two popular event streaming ecosystems, Starlight for Kafka unlocks new use cases and reduces barriers for users adopting Pulsar. Leverage advantages from each ecosystem and build a truly unified event streaming platform with Starlight for Kafka to accelerate the development of real-time applications and services.
This document will help you get started producing and consuming Kafka messages on a Pulsar cluster.
Starlight for Kafka Quickstart
- To start connecting Starlight for Kafka, select Kafka in the Astra Streaming Connect tab.
- When the popup appears, confirm you want to enable Kafka on your tenant.

  You can’t remove the Kafka namespaces created on your tenant with this step. You must remove the tenant itself to remove these namespaces.

- Select Enable Kafka to create a configuration file and the following three namespaces in your Astra Streaming tenant:

  - kafka for producing and consuming messages
  - __kafka for functionality
  - __kafka_unlimited for storing metadata

- Save the configuration to an ssl.properties file:

      username: TENANT_NAME
      password: token:
      bootstrap.servers: kafka-aws-useast2.streaming.datastax.com:9093
      schema.registry.url: https://kafka-aws-useast2.streaming.datastax.com:8081
      security.protocol: SASL_SSL
      sasl.mechanism: PLAIN

  The configuration details depend on your Astra Streaming tenant configuration.
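Because Starlight for Kafka speaks the native Kafka protocol, a standard Kafka client can connect with nothing more than these connection settings. The following is a minimal sketch, not part of the official quickstart: it assumes you have translated the saved values into standard key=value Kafka client properties (including a sasl.jaas.config entry with your tenant name and token, as shown in the next section) and saved them as config/ssl.properties. It loads that file and lists the topics visible to your tenant as a connectivity check.

    import java.io.FileInputStream;
    import java.util.Properties;
    import java.util.Set;
    import org.apache.kafka.clients.admin.AdminClient;

    public class ConnectivityCheck {
        public static void main(String[] args) throws Exception {
            // Load the same client properties the console tools use below.
            // Assumes bootstrap.servers, security.protocol, sasl.mechanism, and
            // sasl.jaas.config are set for your Astra Streaming tenant.
            Properties props = new Properties();
            try (FileInputStream in = new FileInputStream("config/ssl.properties")) {
                props.load(in);
            }

            // The admin client uses the properties exactly as a producer or consumer would.
            try (AdminClient admin = AdminClient.create(props)) {
                Set<String> topics = admin.listTopics().names().get();
                System.out.println("Connected. Visible topics: " + topics);
            }
        }
    }

Compile and run this against the kafka-clients library (for example, org.apache.kafka:kafka-clients:3.1.0) from the directory that contains config/ssl.properties.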
Connect Kafka and Pulsar
This example uses tools included with the Apache Kafka tarball.
- Create a new topic in your kafka namespace, for example from the namespace view in the Astra Streaming UI or with the kafka-topics.sh tool included in the tarball. This example creates a topic named test-topic in the kafka namespace on a tenant named tenant-1.
- Move your ssl.properties file to your kafka_2.13-3.1.0/config folder. These values are required for SSL encryption. For this example, the values are as follows:

      bootstrap.servers=kafka-aws-useast2.streaming.datastax.com:9093
      security.protocol=SASL_SSL
      sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username='tenant-1' password='token:{pulsar tenant token}';
      sasl.mechanism=PLAIN
      session.timeout.ms=45000
- In the Kafka installation directory, create a Kafka producer to produce messages on tenant-1/kafka/test-topic:

      bin/kafka-console-producer.sh --broker-list kafka-aws-useast2.streaming.datastax.com:9093 --topic tenant-1/kafka/test-topic --producer.config config/ssl.properties

  Once the producer is ready, it accepts standard input from the user:

      >hello pulsar
- In a new terminal window, create a Kafka consumer to consume messages from the beginning of tenant-1/kafka/test-topic:

      bin/kafka-console-consumer.sh --bootstrap-server kafka-aws-useast2.streaming.datastax.com:9093 --topic tenant-1/kafka/test-topic --consumer.config config/ssl.properties --from-beginning

  The consumer prints the message sent by the producer:

      hello pulsar
- Send a few messages, and then return to your kafka namespace dashboard in Astra Streaming to monitor your activity. Your Kafka messages are being produced and consumed in a Pulsar cluster. For the same walkthrough in application code, see the client sketch below.
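The console tools above are convenient for a first test, but the same walkthrough works unchanged from application code, which is the point of Starlight for Kafka. The following is an illustrative sketch rather than part of the official example: it assumes the config/ssl.properties file and the tenant-1/kafka/test-topic topic from the previous steps, produces one message, and then reads the topic from the beginning with a hypothetical consumer group named hello-pulsar-group.

    import java.io.FileInputStream;
    import java.time.Duration;
    import java.util.Collections;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerConfig;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringDeserializer;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class HelloPulsar {
        private static final String TOPIC = "tenant-1/kafka/test-topic";

        public static void main(String[] args) throws Exception {
            // Shared connection settings: bootstrap.servers, SASL_SSL, PLAIN, and the JAAS config.
            Properties common = new Properties();
            try (FileInputStream in = new FileInputStream("config/ssl.properties")) {
                common.load(in);
            }

            // Produce a single message, like the console producer step above.
            Properties producerProps = new Properties();
            producerProps.putAll(common);
            try (KafkaProducer<String, String> producer =
                    new KafkaProducer<>(producerProps, new StringSerializer(), new StringSerializer())) {
                producer.send(new ProducerRecord<>(TOPIC, "hello pulsar")).get();
            }

            // Consume from the beginning, like the console consumer step above.
            Properties consumerProps = new Properties();
            consumerProps.putAll(common);
            consumerProps.put(ConsumerConfig.GROUP_ID_CONFIG, "hello-pulsar-group");
            consumerProps.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
            try (KafkaConsumer<String, String> consumer =
                    new KafkaConsumer<>(consumerProps, new StringDeserializer(), new StringDeserializer())) {
                consumer.subscribe(Collections.singletonList(TOPIC));
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(10));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.println("Received: " + record.value());
                }
            }
        }
    }

Because the client only needs connection properties, an existing Kafka application can usually be pointed at Astra Streaming by swapping its configuration file alone.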