dse spark-history-server

Starts and stops the Apache Spark™ history server, the front-end application that displays logging data from all nodes in the Spark cluster.

Restriction: Configuration is required for the Spark history server. See Apache Spark history server.
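
The history server can only display jobs that wrote event logs, so event logging must be enabled and the jobs and the history server must agree on the log location. As a minimal sketch, assuming the standard Apache Spark property names and an example DSE File System path (the exact DSE settings and locations are covered in the Apache Spark history server topic), the relevant entries in conf/spark-defaults.conf look like this:

# Sketch only: standard Spark properties; dsefs:///spark/events is an example path
spark.eventLog.enabled           true
spark.eventLog.dir               dsefs:///spark/events
spark.history.fs.logDirectory    dsefs:///spark/events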

Synopsis

dse spark-history-server
  start [--properties-file <properties_file>] | stop
Syntax legend

Italic, bold, or < >

Syntax diagrams and code samples use one or more of these styles to mark placeholders for variable values. Replace placeholders with a valid option or your own user-defined value.

In CQL statements, angle brackets are required to enclose data types in a set, list, map, or tuple. Separate the data types with a comma. For example: <datatype1,datatype2>.

In Search CQL statements, angle brackets are used to identify the entity and literal value to overwrite the XML element in the schema and solrconfig files, such as @<xml_entity>='<xml_entity_type>'.

[ ]

Square brackets surround optional command arguments. Do not type the square brackets.

( )

Parentheses identify a group to choose from. Do not type the parentheses.

|

A pipe separates alternative elements. Type any one of the elements. Do not type the pipe.

...

Indicates that you can repeat the syntax element as often as required.

'

Single quotation marks must surround literal strings in CQL statements. Use single quotation marks to preserve upper case. For Search CQL only: single quotation marks surround an entire XML schema declaration, such as '<schema> ... </schema>'.

{ }

Map collection. Curly braces enclose maps ({ <key_datatype>:<value_datatype> }) or key value pairs ({ <key>:<value> }). A colon separates the key and the value.

;

Ends a CQL statement.

--

Separate command line options from command arguments with two hyphens. This syntax is useful when arguments might be mistaken for command line options.

start

Starts the Spark history server to load the event logs from Spark jobs that were run with event logging enabled. The Spark history server can be started from any node in the cluster.

--properties-file <properties_file>

The properties file that overrides the default Spark configuration in conf/spark-defaults.conf. The properties file can include settings such as the authentication method, credentials, and the event log location; an illustrative file is sketched after the option descriptions.

stop

Stops the Spark history server.
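
For illustration, a properties file passed with --properties-file might contain entries such as the following. This is a sketch that uses standard Apache Spark history server properties with example values; the entries your cluster actually needs, including any authentication settings, depend on the configuration described in Apache Spark history server.

# Sketch only: standard Spark history server properties with example values
spark.history.fs.logDirectory    dsefs:///spark/events
spark.history.ui.port            18080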

Examples

Start the Apache Spark history server on the local node

dse spark-history-server start

The Spark history server is started with the default configuration in conf/spark-defaults.conf.

Start the Apache Spark history server with a properties file

dse spark-history-server start --properties-file sparkproperties.conf

The Spark history server is started with the configuration specified in sparkproperties.conf.
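
Stop the Apache Spark history server

dse spark-history-server stop

The running Spark history server is stopped.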
