Explanation of how the connector ingests topics to DataStax Enterprise database tables.
Components of a DataStax Apache Kafka Connector implementation.
Writes to the message queue
Overview of the Apache Kafka™ topic data pipeline.
Writes to DataStax
Data from the Kafka topic is written to the mapped DataStax platform database table using a batch request containing multiple write statements.
System requirements
Requirements vary depending on the workload and network capacity.
Supported versions
Kafka and DataStax platform compatibility matrix.
Release notes for DataStax Apache Kafka Connector.
Install on Linux-based platform using a binary tarball.
Configuring the connector.
Adjusting the number of tasks, simultaneous writes, and batch size.
Reading serialized bytes
Configure the worker to deserialize messages using the converter that corresponds to the producer's serializer.
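For example, if the producer serialized record values as schemaless JSON, the worker's converter settings might look like the following sketch (the property names are standard Kafka Connect worker settings; the choice of JSON here is an assumed example):

```properties
# Example worker configuration (e.g., connect-standalone.properties).
# Assumes the producer wrote schemaless JSON; use the converter that
# matches your producer's serializer instead.
key.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schemas.enable=false
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=false
```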
Specify writetime timestamp column
Optionally specify the column to use for the writetime timestamp when inserting records from Kafka into DSE or DDAC database tables.
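As a sketch, the writetime source can be designated in the topic-table mapping using the reserved `__timestamp` target; the topic, keyspace, table, and field names below are illustrative:

```properties
# Map a Kafka field (created_at, an example name) to the row's
# writetime via the reserved __timestamp mapping target.
topic.my_topic.my_ks.my_table.mapping = col1=value.field1, __timestamp=value.created_at
```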
Setting row-level TTL
Set row-level TTL from Kafka fields.
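Similarly, a minimal mapping sketch using the reserved `__ttl` target to take the TTL from a Kafka field (all names below are illustrative):

```properties
# Map a Kafka field (ttl_seconds, an example name) to the row's TTL
# via the reserved __ttl mapping target.
topic.my_topic.my_ks.my_table.mapping = col1=value.field1, __ttl=value.ttl_seconds
```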
Topic to tables
Simple but powerful syntax for mapping Kafka fields to DataStax database table columns.
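A minimal mapping sketch, assuming a topic named `cyclingComments` and a `cycling.comments` table whose columns mirror the record's value fields (the field names are assumptions):

```properties
# topic.<topic>.<keyspace>.<table>.mapping pairs each table column
# with a field from the record key or value.
topic.cyclingComments.cycling.comments.mapping = id=value.id, commenter=value.commenter, comment=value.comment, created_at=value.created_at
```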
Use the sample configuration files as a starting point.
Maintaining and operating the DataStax Apache Kafka Connector.
About maintenance tasks
Use the Kafka Connect REST API to operate and maintain the DataStax Connector.
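For instance, the standard Kafka Connect REST API endpoints can drive these maintenance tasks; the sketch below assumes the worker listens on the default port 8083 and that the connector is registered under the example name `dse-connector`:

```shell
# Common Kafka Connect REST API operations (connector name is an example).
curl -s http://localhost:8083/connectors/dse-connector/status     # show status
curl -s -X PUT  http://localhost:8083/connectors/dse-connector/pause    # pause
curl -s -X PUT  http://localhost:8083/connectors/dse-connector/resume   # resume
curl -s -X POST http://localhost:8083/connectors/dse-connector/restart  # restart
curl -s -X DELETE http://localhost:8083/connectors/dse-connector        # remove
```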
Start the connector from the Kafka installation directory.
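In standalone mode this can look like the following sketch, run from the Kafka installation directory (the connector properties file name is an example):

```shell
# Start a standalone Kafka Connect worker with the DataStax connector;
# dse-connector.properties is an assumed file name.
bin/connect-standalone.sh config/connect-standalone.properties dse-connector.properties
```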
Scaling the DataStax Connector
Respond to increases or decreases in workload.
Schema changes
How to update the DataStax Connector when schema changes are required.
Verify data
Verify that data from a mapped Kafka topic was written to the database table column.
Restart connector
Restart the DataStax Apache Kafka™ Connector.
Restart tasks
Restart DataStax Apache Kafka Connector tasks.
Stop the tasks that the DataStax Apache Kafka™ Connector is running without removing the configuration from the worker.
After a pause, resume the DataStax Apache Kafka™ Connector.
Display configuration
Display configuration of a running connector.
Update configuration
Change the configuration of a running connector.
Remove the DataStax Apache Kafka™ Connector and all of its tasks.
Display the status of a running connector.
Configure security between the DataStax Connector and the DataStax cluster.
Monitor the connector using Java Management Extensions (JMX) MBeans, which report metrics for both the Kafka Connect workers and the DataStax Apache Kafka Connector.
Enable remote connections
Allow remote JMX connections to monitor DataStax Apache Kafka Connector activity.
DataStax Connector metrics
Use JMX to monitor the DataStax Connector.
Kafka Connect metrics
Find answers to common issues and errors.
Connector not found
DataStax Apache Kafka Connector fails to start when registering a configuration with the worker.
Record processing
Determine if the connector is processing records.
Record fails to write
Records never appear in the DataStax Enterprise database table.
Writing fails because of mutation size
Batches from the Kafka Connector are rejected when the maximum mutation size is exceeded.
Data parsing fails
Data conversion fails.
Missing field error
If a Kafka record is missing fields that are set in the topic-table mapping, data parsing fails.
Bind address already in use
Occurs when another process is already using the configured port.
Load balancing datacenter is not specified
DataStax Apache Kafka Connector fails to start because the load balancing DC is not configured.
Step-by-step implementation for test or demonstration environments running Apache Kafka and DataStax database on the same system.
Setting up Kafka and the connector
Install DataStax Apache Kafka Connector and configure the cycling comments topic.
Setting up the DataStax Enterprise database
Create a cycling keyspace with comments table on the DataStax Enterprise database.
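A CQL sketch of what this setup step might look like; the replication settings and the exact column list are assumptions, not the tutorial's definitive schema:

```sql
-- Example keyspace and table for the cycling tutorial; replication
-- factor and columns are illustrative.
CREATE KEYSPACE cycling
  WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1};

CREATE TABLE cycling.comments (
  id uuid,
  created_at timestamp,
  commenter text,
  comment text,
  PRIMARY KEY (id, created_at)
);
```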
DataStax Connector configuration
Inserting data from a JSON file into the Kafka topic
Use the Apache Kafka producer to stream data into the cyclingComments topic.
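As a sketch, the console producer shipped with Kafka can stream a newline-delimited JSON file into the topic; the broker address and file name below are assumptions:

```shell
# Stream one JSON record per line into the cyclingComments topic;
# cycling_comments.json is an assumed file name.
bin/kafka-console-producer.sh --broker-list localhost:9092 \
  --topic cyclingComments < cycling_comments.json
```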
Verifying records processed and writes
Ensure that the tutorial data was received by Kafka and records were processed by the connector.