dse spark-jobserver

Starts and stops the Spark Jobserver that is bundled with DSE.

Restriction: Command is supported only on nodes with analytics workloads.


dse spark-jobserver start
[--properties-file <path_to_properties_file>]
[--executor-memory <memory>] [--total-executor-cores <cores>]
[--conf name=spark.<value>] [--jars <path_to_additional_jars>]
[--verbose] | stop
Syntax conventions            Description

UPPERCASE                     Literal keyword.

Lowercase                     Not literal.

<variable>                    Variable value. Replace with a valid option or user-defined value.

[ ]                           Optional. Square brackets ( [ ] ) surround optional command arguments. Do not type the square brackets.

( )                           Group. Parentheses ( ( ) ) identify a group to choose from. Do not type the parentheses.

|                             Or. A vertical bar ( | ) separates alternative elements. Type any one of the elements. Do not type the vertical bar.

...                           Repeatable. An ellipsis ( ... ) indicates that you can repeat the syntax element as often as required.

'<Literal string>'            Single quotation marks ( ' ) must surround literal strings in CQL statements. Use single quotation marks to preserve upper case.

{ <key>:<value> }             Map collection. Braces ( { } ) enclose map collections or key-value pairs. A colon separates the key and the value.

<datatype1,datatype2>         Set, list, map, or tuple. Angle brackets ( < > ) enclose data types in a set, list, map, or tuple. Separate the data types with a comma.

cql_statement;                End CQL statement. A semicolon ( ; ) terminates all CQL statements.

[--]                          Separate the command line options from the command arguments with two hyphens ( -- ). This syntax is useful when arguments might be mistaken for command line options.

'<schema> ... </schema>'      Search CQL only: Single quotation marks ( ' ) surround an entire XML schema declaration.

@xml_entity='xml_entity_type' Search CQL only: Identify the entity and literal value to overwrite the XML element in the schema and solrconfig files.


start

Starts the Spark Jobserver.

--verbose

Displays which arguments are recognized as Spark configuration options and which arguments are forwarded to the Spark shell.

stop

Stops the Spark Jobserver.

For the dse spark-jobserver start command, apply one or more valid spark-submit options.

--properties-file path_to_properties_file

The location of the properties file that has the configuration settings. By default, Spark loads the settings from spark-defaults.conf.
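For reference, Spark properties files use the spark-defaults.conf format: one spark.-prefixed property per line, separated from its value by whitespace. The properties and values below are illustrative examples only, not DSE defaults.

```
# Example spark-defaults-style properties file (values are illustrative)
spark.executor.memory     2g
spark.cores.max           4
spark.eventLog.enabled    true
```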

--executor-memory mem

The amount of memory that each executor can consume for the application. Spark uses a 512 MB default. Specify the memory argument in JVM format using the k, m, or g suffix, for example 512m or 2g.

--total-executor-cores cores

The total number of cores the application uses.

--conf name=spark.value|sparkproperties.conf

An arbitrary Spark option that is added to the Spark configuration:

  • name=spark.value - a Spark option name prefixed by spark., followed by its value.

  • sparkproperties.conf - a configuration file that contains Spark options.

--jars path_to_additional_jars

A comma-separated list of paths to additional JAR files.
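The options above can be combined in a single start invocation. The sketch below uses hypothetical file paths and resource values; substitute values that are valid for your deployment.

```
# Illustrative example combining several submit options;
# paths and values are placeholders.
dse spark-jobserver start \
  --properties-file /etc/dse/spark/jobserver.conf \
  --executor-memory 2g \
  --total-executor-cores 4 \
  --jars /opt/app/lib/extra-1.jar,/opt/app/lib/extra-2.jar \
  --verbose
```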


Start the Spark Jobserver with a submit option

dse spark-jobserver start --properties-file spark.conf

Stop the Spark Jobserver

dse spark-jobserver stop
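One way to confirm that the Jobserver is running is to query its REST API. This sketch assumes the open-source Spark Jobserver default port, 8090, on the local node; the port in your deployment may differ.

```
# Query the Spark Jobserver REST API (assumes default port 8090)
curl http://localhost:8090/jars
```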


© 2024 DataStax