DataStax Enterprise 6.8 Security Guide

Using tools with TDE-encrypted SSTables

Introduction

This topic explains how to get tools working with SSTables that are encrypted with Transparent Data Encryption (TDE). You may need to set one or more values, most likely by exporting environment variables in the shell from which you run tools such as sstabledump, sstablerepairedset, and sstableloader.

Steps to get tools working with TDE-encrypted SSTables

If TDE is configured, encryption keys are managed in one of two ways:

  • storing the keys locally in the cluster itself (that is, local storage)

  • using a KMIP server

When local storage is used, the tool must query the cluster to retrieve the encryption key or keys needed to work with the encrypted SSTable files on local disk. Because most of the DSE SSTable tools require stopping the local DSE process on the node before running the tool, the tool must connect to a different node in the cluster. Provide the IP address of that node by exporting the DSE_HOST environment variable. Example:

export DSE_HOST=12.34.56.789

If KMIP is used for key management, this step is not necessary.
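
For the local storage case, a typical session might look like the following. This is a sketch only: the node address, keyspace, and SSTable path are hypothetical placeholders, and the service command assumes a package installation of DSE.

# Stop the local DSE process before running SSTable tools (package install).
sudo service dse stop

# Point the tool at another node in the cluster that can serve the encryption key.
export DSE_HOST=10.200.1.17

# Dump the encrypted SSTable (hypothetical keyspace, table, and file path).
sstabledump /var/lib/cassandra/data/cycling/cyclist_name-1a2b3c4d/aa-1-bti-Data.db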

Setting authentication credentials

In addition to TDE, the cluster may also have authentication configured. If TDE local storage is in use together with username/password authentication, supply those credentials in addition to the DSE_HOST value. DSE internal authentication and LDAP are two schemes that may require a username and password, depending on your configuration; Kerberos is an example of a scheme that does not require username/password credentials.

If a username and password are required, provide the values via the DSE_USERNAME and DSE_PASSWORD environment variables. Example:

export DSE_USERNAME=myusername
export DSE_PASSWORD=mysupersecretpassword

Alternatively, DSE retrieves the username and password values from the .dserc file, if present.
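
As a sketch, a .dserc file (placed in the home directory of the user running the tool) that supplies these credentials typically contains two key=value lines; the values shown here are placeholders:

username=myusername
password=mysupersecretpassword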

Additional considerations

When TDE local storage is in place and the tool being used encounters an encrypted SSTable file, the tool uses the DSE Java driver to connect to a remote DSE node in the cluster and submit a CQL query that retrieves the necessary encryption key. The tool (such as sstabledump, sstablerepairedset, or sstableloader) must connect to a remote node because, in most cases, the local DSE process must be stopped before using any of the SSTable-related tools. The connection credentials are needed because the tool, via the driver, effectively becomes a client of the cluster.

The dse.in.sh shell script converts the three environment variables mentioned above into Java system properties:

  • DSE_HOST becomes -Ddse.replicatedkeyprovider.client

  • DSE_USERNAME becomes -Ddse.replicatedkeyprovider.username

  • DSE_PASSWORD becomes -Ddse.replicatedkeyprovider.password
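
Putting this together, the exports shown earlier reach the tool's JVM as flags equivalent to the following (using the placeholder values from the examples above):

-Ddse.replicatedkeyprovider.client=12.34.56.789
-Ddse.replicatedkeyprovider.username=myusername
-Ddse.replicatedkeyprovider.password=mysupersecretpassword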

ReplicatedKeyProvider is the DSE Java class that consumes these values and handles retrieving the encryption key(s) from the cluster. The class includes a log statement that is helpful for debugging; it is logged at INFO level and appears in the system.log file. Example log entry format:

"Checking for 'dse.replicatedkeyprovider' system properties, found hostname: %s; username: %s; [no|non-null] password"