dse spark-jobserver

Starts and stops the Spark Jobserver that is bundled with DSE.


The default location of the spark-defaults.conf file depends on the type of installation:
  • Package installations: /etc/dse/spark/spark-defaults.conf
  • Tarball installations: installation_location/resources/spark/conf/spark-defaults.conf


Restriction: This command is supported only on nodes running an analytics workload.

See Spark Jobserver.


dse spark-jobserver start
    [--properties-file path_to_properties_file]
    [--executor-memory memory] [--total-executor-cores cores]
    [--conf name=spark.value] [--jars path_to_additional_jars]
    [--help] [--verbose]
dse spark-jobserver stop
Table 1. Legend
Syntax conventions Description
UPPERCASE Literal keyword.
Lowercase Not literal.
Italics Variable value. Replace with a valid option or user-defined value.
[ ] Optional. Square brackets ( [ ] ) surround optional command arguments. Do not type the square brackets.
( ) Group. Parentheses ( ( ) ) identify a group to choose from. Do not type the parentheses.
| Or. A vertical bar ( | ) separates alternative elements. Type any one of the elements. Do not type the vertical bar.
... Repeatable. An ellipsis ( ... ) indicates that you can repeat the syntax element as often as required.
'Literal string' Single quotation ( ' ) marks must surround literal strings in CQL statements. Use single quotation marks to preserve upper case.
{ key:value } Map collection. Braces ( { } ) enclose map collections or key value pairs. A colon separates the key and the value.
<datatype1,datatype2> Set, list, map, or tuple. Angle brackets ( < > ) enclose data types in a set, list, map, or tuple. Separate the data types with a comma.
cql_statement; End CQL statement. A semicolon ( ; ) terminates all CQL statements.
[ -- ] Separate the command line options from the command arguments with two hyphens ( -- ). This syntax is useful when arguments might be mistaken for command line options.
' <schema> ... </schema> ' Search CQL only: Single quotation marks ( ' ) surround an entire XML schema declaration.
@xml_entity='xml_entity_type' Search CQL only: Identify the entity and literal value to overwrite the XML element in the schema and solrconfig files.
start
  Starts the Spark Jobserver.
--help
  Displays options and usage instructions.
--verbose
  Displays which arguments are recognized as Spark configuration options and which arguments are forwarded to the Spark shell.
stop
  Stops the Spark Jobserver.
For the dse spark-jobserver start command, apply one or more valid spark-submit options.
--properties-file path_to_properties_file
The location of the properties file that has the configuration settings. By default, Spark loads the settings from spark-defaults.conf.
--executor-memory memory
The amount of memory that each executor can consume for the application. Spark uses a 512 MB default. Specify the memory argument in JVM format using the k, m, or g suffix.
--total-executor-cores cores
The total number of cores the application uses.
--conf name=spark.value|sparkproperties.conf
An arbitrary Spark option to add to the Spark configuration. Property names are prefixed by spark.
  • name=spark.value - a single Spark property and its value
  • sparkproperties.conf - a configuration file with Spark properties
--jars path_to_additional_jars
A comma-separated list of paths to additional JAR files.
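The options above can be combined in a single start invocation. A sketch with hypothetical file paths and values; spark.eventLog.enabled is a standard Spark property used here only as an example:

```shell
dse spark-jobserver start \
  --properties-file /etc/dse/spark/spark-defaults.conf \
  --executor-memory 1g \
  --total-executor-cores 4 \
  --conf spark.eventLog.enabled=true \
  --jars /tmp/extra-lib1.jar,/tmp/extra-lib2.jar
```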


Start the Spark Jobserver without submit options

dse spark-jobserver start

Start the Spark Jobserver with a submit option

dse spark-jobserver start --properties-file spark.conf
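The spark.conf file referenced above is a standard Spark properties file, with one property per line. A minimal sketch, assuming hypothetical values; the property names are standard Spark configuration keys:

```properties
# Hypothetical example contents of spark.conf
spark.executor.memory     1g
spark.cores.max           4
spark.eventLog.enabled    true
```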

See spark-submit options.

Stop the Spark Jobserver

dse spark-jobserver stop