dse client-tool configuration byos-export

Exports the DSE node configuration to a Spark-compatible file that can be copied to a node in the external Spark cluster and used with the Spark shell.

Synopsis

$ dse client-tool connection_options configuration byos-export
[--default-properties path_to_existing_properties_file]
[--export-credentials]
[--generate-token [--token-renewer username]]
[--set-keystore-path path] [--set-keystore-password password] [--set-keystore-type type]
[--set-truststore-path path] [--set-truststore-password password] [--set-truststore-type type]
file
Syntax conventions

UPPERCASE

Literal keyword.

Lowercase

Not literal.

Italics

Variable value. Replace with a valid option or user-defined value.

[ ]

Optional. Square brackets ( [ ] ) surround optional command arguments. Do not type the square brackets.

( )

Group. Parentheses ( ( ) ) identify a group to choose from. Do not type the parentheses.

|

Or. A vertical bar ( | ) separates alternative elements. Type any one of the elements. Do not type the vertical bar.

...

Repeatable. An ellipsis ( ... ) indicates that you can repeat the syntax element as often as required.

'Literal string'

Single quotation marks ( ' ) must surround literal strings in CQL statements. Use single quotation marks to preserve upper case.

{ key:value }

Map collection. Braces ( { } ) enclose map collections or key-value pairs. A colon separates the key and the value.

<datatype1,datatype2>

Set, list, map, or tuple. Angle brackets ( < > ) enclose data types in a set, list, map, or tuple. Separate the data types with a comma.

cql_statement;

End CQL statement. A semicolon ( ; ) terminates all CQL statements.

[ -- ]

Separate the command line options from the command arguments with two hyphens ( -- ). This syntax is useful when arguments might be mistaken for command line options.

' <schema> ... </schema> '

Search CQL only: Single quotation marks ( ' ) surround an entire XML schema declaration.

@xml_entity='xml_entity_type'

Search CQL only: Identify the entity and literal value to overwrite the XML element in the schema and solrconfig files.

Options

--default-properties path_to_existing_properties_file

The path to an existing default Spark properties file, such as spark-defaults.conf. Properties from this file are merged with the DSE Spark properties in the generated file.

--export-credentials

Store the current DSE user name and password in the generated configuration file.

file

The file name for the generated Spark-compatible file. For example, byos.properties.

--generate-token

Generates a digest authentication token to support access to DSE clusters secured with Kerberos from non-Kerberos clusters.

--set-keystore-password password

The keystore password for connection to the database when SSL client authentication is enabled.

--set-keystore-path path

The path to the SSL keystore when SSL client authentication is enabled. All nodes must store the keystore in the same location.

--set-keystore-type type

The keystore type when SSL client authentication is enabled. If not specified, the default is JKS.

--set-truststore-password password

Include the specified truststore password in the configuration file.

--set-truststore-path path

The path to the SSL truststore on the Spark nodes. All nodes must store the truststore in the same location.

--set-truststore-type type

The truststore type when SSL client authentication is enabled. If not specified, the default is JKS.

--token-renewer username

The user with permission to renew or cancel the token. When this option is not specified, only the DSE process can renew the generated token.

Examples

You can export the DSE node configuration to a Spark-compatible file with various options.

Generate the byos.properties file in your home directory

$ dse client-tool configuration byos-export ~/byos.properties
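
Export credentials and generate an authentication token

As a sketch, the --export-credentials and --generate-token options described above can be combined in a single export; the renewer name jsmith is a placeholder for a user in your environment.

$ dse client-tool configuration byos-export --export-credentials --generate-token --token-renewer jsmith ~/byos.properties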

Merge the default Spark properties with the DSE Spark properties

$ dse client-tool configuration byos-export --default-properties /usr/lib/spark/conf/spark-defaults.conf /home/user1/.dse/byos.conf
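
Use the exported file with the Spark shell

As a sketch, on a node in the external Spark cluster you can pass the exported file to the Spark shell with the standard --properties-file option; the exact invocation depends on how your Spark cluster is deployed.

$ spark-shell --properties-file ~/byos.properties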
