Generating Spark SQL schema files

Spark SQL running on an external Spark cluster can import schema files generated by DataStax Enterprise, allowing the external cluster to query DSE tables.

Procedure

  1. Export the schema file using dse client-tool (a sketch of the exported file's contents is shown in the example after this procedure).

    dse client-tool --use-server-config spark sql-schema --all > output.sql
  2. Copy the schema file to an external Spark node.

    scp output.sql user@sparknode1.example.com:
  3. On the Spark node, import the schema using Spark SQL (a way to verify the import is shown in the example after this procedure).

    spark-sql --jars byos-5.1.jar --properties-file byos.properties -f output.sql
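
Example

The exported file contains Spark SQL DDL that maps DSE tables to the Cassandra data source. The following is a minimal sketch of what output.sql might contain, assuming a hypothetical keyspace named store with a table named orders; the exact statements depend on your schema and DSE version.

    -- Hypothetical exported schema; keyspace and table names are illustrative.
    CREATE DATABASE IF NOT EXISTS store;

    CREATE TABLE IF NOT EXISTS store.orders
      USING org.apache.spark.sql.cassandra
      OPTIONS (
        keyspace "store",
        table "orders",
        pushdown "true");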

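To verify the import, run a query against one of the imported tables from the same Spark node, using the same BYOS jar and properties file. A minimal sketch, again assuming the hypothetical store keyspace:

    spark-sql --jars byos-5.1.jar --properties-file byos.properties -e "SHOW TABLES IN store"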