dse spark-history-server
Starts and stops the Spark history server, the front-end application that displays logging data from all nodes in the Spark cluster.
Restriction: Configuration is required for the Spark history server. See Spark history server.
Synopsis
dse spark-history-server start [--properties-file properties_file]|stop
| Syntax conventions | Description |
|---|---|
| UPPERCASE | Literal keyword. |
| Lowercase | Not literal. |
| *Italics* | Variable value. Replace with a valid option or user-defined value. |
| `[ ]` | Optional. Square brackets ( `[ ]` ) surround optional command arguments. Do not type the square brackets. |
| `( )` | Group. Parentheses ( `( )` ) identify a group to choose from. Do not type the parentheses. |
| `\|` | Or. A vertical bar ( `\|` ) separates alternative elements. Type any one of the elements. Do not type the vertical bar. |
| `...` | Repeatable. An ellipsis ( `...` ) indicates that you can repeat the syntax element as often as required. |
| `'Literal string'` | Single quotation ( `'` ) marks must surround literal strings in CQL statements. Use single quotation marks to preserve uppercase. |
| `{ key:value }` | Map collection. Braces ( `{ }` ) enclose map collections or key-value pairs. A colon separates the key and the value. |
| `<datatype1,datatype2>` | Set, list, map, or tuple. Angle brackets ( `< >` ) enclose data types in a set, list, map, or tuple. Separate the data types with a comma. |
| `cql_statement;` | End CQL statement. A semicolon ( `;` ) terminates all CQL statements. |
| `[ -- ]` | Separate the command line options from the command arguments with two hyphens ( `--` ). This syntax is useful when arguments might be mistaken for command line options. |
| `' <schema> ... </schema> '` | Search CQL only: single quotation marks ( `'` ) surround an entire XML schema declaration. |
| `@xml_entity='xml_entity_type'` | Search CQL only: identify the entity and literal value to overwrite the XML element in the schema and solrconfig files. |
- start
- Starts the Spark history server, which loads the event logs from Spark jobs that were run with event logging enabled. The Spark history server can be started from any node in the cluster.
- --properties-file properties_file
- A properties file that overrides the default Spark configuration in conf/spark-defaults.conf. The properties file can include settings such as the authentication method, credentials, and the event log location.
- stop
- Stops the Spark history server.
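For reference, a file passed with --properties-file uses standard Spark properties syntax. The following is a minimal sketch, not a definitive configuration: the property names are standard Spark settings, but the file name and paths are placeholder assumptions to adapt for your cluster.

```properties
# sparkproperties.conf -- example overrides for the Spark history server.
# Paths below are placeholders; adjust them for your environment.

# Directory where Spark jobs write event logs (must match the location
# used when the jobs ran with event logging enabled).
spark.eventLog.dir                file:///var/lib/spark/events

# Directory the history server reads event logs from.
spark.history.fs.logDirectory     file:///var/lib/spark/events

# How often the history server checks for new or updated event logs.
spark.history.fs.update.interval  10s
```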
Examples
Start the Spark history server on the local node
dse spark-history-server start

The Spark history server starts with the default configuration in conf/spark-defaults.conf.
Start the Spark history server with a properties file
dse spark-history-server start --properties-file sparkproperties.conf

The Spark history server starts with the configuration specified in sparkproperties.conf.