dse spark-jobserver

Starts and stops the Apache Spark™ Jobserver that is bundled with DataStax Enterprise (DSE).

This command is supported only on nodes with analytics workloads.

Synopsis

dse spark-jobserver start
[--properties-file <path_to_properties_file>]
[--executor-memory <memory>] [--total-executor-cores <cores>]
[--conf name=spark.<value>] [--jars <path_to_additional_jars>]
[--verbose] | stop
Syntax legend
Syntax conventions Description

Italic, bold, or < >

Syntax diagrams and code samples use one or more of these styles to mark placeholders for variable values. Replace placeholders with a valid option or your own user-defined value.

In CQL statements, angle brackets are required to enclose data types in a set, list, map, or tuple. Separate the data types with a comma. For example: <datatype1>,<datatype2>

In Search CQL statements, use angle brackets to identify the entity and literal value to overwrite the XML element in the schema and solrconfig files, such as @<xml_entity>='<xml_entity_type>'.

[ ]

Square brackets surround optional command arguments. Do not type the square brackets.

( )

Parentheses identify a group to choose from. Do not type the parentheses.

|

A pipe separates alternative elements. Type any one of the elements. Do not type the pipe.

...

Indicates that you can repeat the syntax element as often as required.

'

Use single quotation marks to surround literal strings in CQL statements. Use single quotation marks to preserve upper case. For Search CQL only: single quotation marks surround an entire XML schema declaration, such as '<schema> ... </schema>'

{ }

Map collection. Curly braces enclose maps ({ <key_datatype>:<value_datatype> }) or key value pairs ({ <key>:<value> }). A colon separates the key and the value.

;

Ends a CQL statement.

--

Separate command line options from command arguments with two hyphens. This syntax is useful when arguments might be mistaken for command line options.

start

Starts the Spark Jobserver.

--verbose

Displays which arguments are recognized as Spark configuration options and which arguments are forwarded to the Spark shell.
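For example, to see how arguments are parsed when the Jobserver starts (the command form follows the synopsis above):

dse spark-jobserver start --verbose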

stop

Stops the Spark Jobserver.

For the dse spark-jobserver start command, apply one or more valid spark-submit options.

--properties-file <path_to_properties_file>

The location of the properties file that has the configuration settings. By default, Spark loads the settings from spark-defaults.conf.
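For example, a minimal properties file might contain standard Spark settings such as the following (the values shown are illustrative):

spark.executor.memory 2g
spark.cores.max 4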

--executor-memory <memory>

The amount of memory that each executor can consume for the application. Apache Spark uses a 512 MB default. Specify the memory argument in JVM format using the k, m, or g suffix.

--total-executor-cores <cores>

The total number of cores the application uses.
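For example, the following invocation (with illustrative values) limits the application to four cores and gives each executor 2 GB of memory:

dse spark-jobserver start --executor-memory 2g --total-executor-cores 4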

--conf name=spark.value | sparkproperties.conf

An arbitrary Spark configuration option, prefixed by spark., supplied in either form:

  • name=spark.value

  • sparkproperties.conf
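For example, to pass a standard Spark property on the command line (spark.eventLog.enabled is a standard Spark setting; the value shown is illustrative):

dse spark-jobserver start --conf spark.eventLog.enabled=true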

--jars <path_to_additional_jars>

A comma-separated list of paths to additional JAR files.
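For example, to make additional JAR files available to the Jobserver (the paths are illustrative):

dse spark-jobserver start --jars /path/to/extra-lib1.jar,/path/to/extra-lib2.jar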

Examples

Start the Apache Spark Jobserver with a submit option

dse spark-jobserver start --properties-file spark.conf

Stop the Apache Spark Jobserver

dse spark-jobserver stop

© Copyright IBM Corporation 2025

Apache, Apache Cassandra, Cassandra, Apache Tomcat, Tomcat, Apache Lucene, Apache Solr, Apache Hadoop, Hadoop, Apache Pulsar, Pulsar, Apache Spark, Spark, Apache TinkerPop, TinkerPop, Apache Kafka and Kafka are either registered trademarks or trademarks of the Apache Software Foundation or its subsidiaries in Canada, the United States and/or other countries. Kubernetes is the registered trademark of the Linux Foundation.
