CDC for Astra DB

CDC for Astra DB automatically captures changes in real time, de-duplicates the changes, and streams the clean set of changed data into Astra Streaming where it can be processed by client applications or sent to downstream systems.

Astra Streaming processes data changes via a Pulsar topic. By design, the Change Data Capture (CDC) component is simple, with a 1:1 correspondence between the table and a single Pulsar topic.

This guide shows you how to create a CDC connector for your Astra DB deployment and send change data to an Elasticsearch sink.

Enabling CDC for Astra DB increases costs based on your Astra Streaming usage. See Astra Streaming pricing for streaming rates and CDC for Astra DB for CDC metering rates.

Supported data structures

The following data types (with the associated AVRO type or logical-type) are supported for CDC for Astra DB:

  • ascii (string)

  • bigint (long)

  • blob (bytes)

  • boolean (boolean)

  • counter (long)

  • date (int)

  • decimal (cql_decimal)

  • double (double)

  • duration (cql_duration)

  • float (float)

  • inet (string)

  • int (int)

  • list (array)

  • map (map, only string-type keys are supported)

  • set (array)

  • smallint (int)

  • text (string)

  • time (long)

  • timestamp (long)

  • timeuuid (string)

  • tinyint (int)

  • uuid (string)

  • varchar (string)

  • varint (cql_varint / bytes)
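For reference when validating a schema in client code, the mapping above can be captured as a simple lookup table. This is an illustrative sketch; the names are our own and not part of any CDC API:

```python
# Lookup of CQL type -> AVRO type (or logical type), taken from the list above.
CQL_TO_AVRO = {
    "ascii": "string", "bigint": "long", "blob": "bytes", "boolean": "boolean",
    "counter": "long", "date": "int", "decimal": "cql_decimal", "double": "double",
    "duration": "cql_duration", "float": "float", "inet": "string", "int": "int",
    "list": "array", "map": "map", "set": "array", "smallint": "int",
    "text": "string", "time": "long", "timestamp": "long", "timeuuid": "string",
    "tinyint": "int", "uuid": "string", "varchar": "string", "varint": "cql_varint",
}

def avro_type_for(cql_type: str) -> str:
    """Return the AVRO (logical) type for a supported CQL type; raises KeyError otherwise."""
    return CQL_TO_AVRO[cql_type.lower()]
```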

Cassandra static columns are supported:

  • On row-level updates, static columns are included in the message value.

  • On partition-level updates, the clustering keys are null in the message key. The message value only has static columns on INSERT/UPDATE operations.
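As an illustration of the two cases above, consider a table with partition key pk, clustering key ck, and static column s. The field names and value shapes below are hypothetical, chosen only to show where the null appears:

```python
# Hypothetical message shapes for a table t(pk int, ck int, s int STATIC, v int).
row_update_key = {"pk": 1, "ck": 10}          # row-level update: full primary key
row_update_value = {"v": 7, "s": 42}          # static columns included in the value

partition_update_key = {"pk": 1, "ck": None}  # partition-level update: clustering key is null
partition_update_value = {"s": 42}            # value carries only static columns (INSERT/UPDATE)
```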

Columns that use unsupported data types are omitted from the events sent to the data topic. If a row update contains both supported and unsupported data types, the event includes only the columns with supported data types.

AVRO interpretation

Astra DB keys are strings, while CDC produces AVRO messages, which are structures. Converting some AVRO structures requires additional tooling and can result in unexpected output.

The following entries describe the conversion of AVRO logical types. The record type is a schema containing the listed fields.

AVRO complex types

  • collections (AVRO type array; applies to lists and sets): Sets and lists are treated as the AVRO array type, with the items attribute containing the schema of the array’s items.

  • decimal (AVRO type record; fields BIG_INT, DECIMAL_SCALE): The Cassandra DECIMAL type is converted to a record with the cql_decimal logical type.

  • duration (AVRO type record; fields CQL_DURATION_MONTHS, CQL_DURATION_DAYS, CQL_DURATION_NANOSECONDS): The Cassandra DURATION type is converted to a record with the cql_duration logical type.

  • maps (AVRO type map): The Cassandra MAP type is converted to the AVRO map type, but the keys are converted to strings. For complex key types, the key is represented in JSON.
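As a sketch of how a consumer might decode the decimal and duration records, assuming BIG_INT surfaces as the unscaled value in two's-complement big-endian bytes (the exact deserialized representation depends on your AVRO client library):

```python
from decimal import Decimal

def decode_cql_decimal(record: dict) -> Decimal:
    # BIG_INT: unscaled value as big-endian signed bytes; DECIMAL_SCALE: decimal scale.
    unscaled = int.from_bytes(record["BIG_INT"], byteorder="big", signed=True)
    return Decimal(unscaled).scaleb(-record["DECIMAL_SCALE"])

def decode_cql_duration(record: dict) -> tuple:
    # A CQL duration cannot be collapsed to a single unit (months vary in length),
    # so keep the three components separate.
    return (record["CQL_DURATION_MONTHS"],
            record["CQL_DURATION_DAYS"],
            record["CQL_DURATION_NANOSECONDS"])
```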

Limitations

CDC for Astra DB has the following limitations:

  • Does not manage table truncates.

  • Does not sync data available before starting the CDC agent.

  • Does not replay logged batches.

  • Does not manage time-to-live.

  • Does not support range deletes.

  • CQL column names must not match a Pulsar primitive type name (for example, INT32).

  • Does not support multi-region.
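To guard against the column-name limitation above, a schema review script could check names against Pulsar's primitive schema type names. The set below is a partial list based on Pulsar's schema documentation; verify it against the Pulsar version you run:

```python
# Partial list of Pulsar primitive schema type names; CQL column names
# colliding with any of these are not supported by CDC.
PULSAR_PRIMITIVE_TYPE_NAMES = {
    "BOOLEAN", "INT8", "INT16", "INT32", "INT64", "FLOAT", "DOUBLE",
    "BYTES", "STRING", "DATE", "TIME", "TIMESTAMP", "INSTANT",
    "LOCAL_DATE", "LOCAL_TIME", "LOCAL_DATE_TIME",
}

def cdc_safe_column_name(name: str) -> bool:
    """True if the CQL column name does not collide with a Pulsar primitive type name."""
    return name.upper() not in PULSAR_PRIMITIVE_TYPE_NAMES
```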

Creating a tenant and a topic

  1. In astra.datastax.com, select Create a Streaming Tenant.

  2. Enter the name for your new streaming tenant and select a provider.

  3. Select Create Tenant.

Use the default persistent and non-partitioned topic.

Astra Streaming CDC can only be used in a region that supports both Astra Streaming and Astra DB. See Regions for more information.

Creating a table

  1. In your database, create a table with a primary key column:

    CREATE TABLE IF NOT EXISTS <keyspacename>.tbl1 (key text PRIMARY KEY, c1 text);
  2. Confirm you created your table:

    • CQLSH:

      select * from ks1.tbl1;

    • Result:

      token@cqlsh> select * from ks1.tbl1;

       key | c1
      -----+----

      (0 rows)
      token@cqlsh>

Connecting to CDC for Astra DB

  1. In the Astra Portal, go to Databases, and then select your database.

  2. Click the CDC tab.

  3. Click Enable CDC.

  4. Complete the fields to connect CDC.

  5. Select Enable CDC. Once created, your CDC connector appears in the CDC tab.
  6. Enabling CDC creates a new astracdc namespace with two new topics, data- and log-. The log- topic receives the raw change events, which are processed and written as clean data to the data- topic. The log- topic supports internal CDC functionality and should not be consumed directly. Use the data- topic to consume CDC data in Astra Streaming.
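Based on the example topic name shown later in this guide, the data topic appears to follow the pattern data-<database-id>-<keyspace>.<table> in the astracdc namespace. A hedged helper to build the fully qualified topic name (verify the actual name in the Astra Streaming UI for your deployment):

```python
def cdc_data_topic(tenant: str, database_id: str, keyspace: str, table: str) -> str:
    # Pattern inferred from the example data topic name in this guide;
    # the astracdc namespace is created when CDC is enabled.
    return f"persistent://{tenant}/astracdc/data-{database_id}-{keyspace}.{table}"
```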

Connecting Elasticsearch sink

After creating your CDC connector, connect an Elasticsearch sink to it. DataStax recommends using the default Astra Streaming settings.

  1. Select the CDC-enabled table on the database's CDC tab and click Add Elastic Search Sink to use the default settings.

  2. Select the corresponding data topic for the chosen table. The topic name will look something like this: data-64b406e3-28ec-4eaf-a802-69ade0415b58-ks1.tbl1.

  3. Use your Elasticsearch deployment details to complete the fields. To find your Elasticsearch URL, navigate to your deployment in the Elastic Cloud console and copy the Elasticsearch endpoint into the Elastic Search URL field.

  4. Complete the remaining fields.

    Most values will auto-populate. These values are recommended:

    • Ignore Record Key as false

    • Null Value Action as DELETE

    • Enable Schema as true

  5. When the fields are completed, select Create.

If creation is successful, <sink-name> created successfully appears at the top of the screen. You can confirm your new sink was created in the Sinks tab.


Sending messages

Let’s process some changes with CDC.

  1. Go to the CQL console.

  2. Modify the table you created.

    INSERT INTO <keyspacename>.tbl1 (key,c1) VALUES ('32a','bob3123');
    INSERT INTO <keyspacename>.tbl1 (key,c1) VALUES ('32b','bob3123b');
  3. Confirm the changes you’ve made:

    token@cqlsh> select * from ks1.tbl1;
    
     key | c1
    -----+----------
     32a |  bob3123
     32b | bob3123b
    
    (2 rows)

Confirming ECS is receiving data

To confirm ECS is receiving your CDC changes, issue a curl GET request to your ECS deployment.

  1. Get your index name from your ECS sink tab.
  2. Issue your curl GET request with your Elastic username, password, and index name:

    curl -u <username>:<password> \
      -XGET "https://asdev.es.westus2.azure.elastic-cloud.com:9243/<index_name>/_search?pretty" \
      -H 'Content-Type: application/json'

    If you’re using a trial account, the username is elastic.

You will receive a JSON response with your changes to the index (abridged here to the individual hits), which confirms Astra Streaming is sending your CDC changes to your ECS sink:

{
    "_index" : "index.tbl1",
    "_type" : "_doc",
    "_id" : "32a",
    "_score" : 1.0,
    "_source" : {
        "c1" : "bob3123"
    }
},
{
    "_index" : "index.tbl1",
    "_type" : "_doc",
    "_id" : "32b",
    "_score" : 1.0,
    "_source" : {
        "c1" : "bob3123b"
    }
}
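The hits can also be checked programmatically. The sketch below assumes the standard Elasticsearch _search response envelope, where the hits shown above are nested under hits.hits:

```python
def extract_rows(search_response: dict) -> dict:
    """Map each document _id to its _source from an Elasticsearch _search response."""
    hits = search_response.get("hits", {}).get("hits", [])
    return {hit["_id"]: hit["_source"] for hit in hits}
```

For example, feeding it the (enveloped) response above should yield the two rows inserted earlier, keyed by the table's primary key values.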


© 2024 DataStax