Monitoring Spark with the web interface

A Spark web interface is bundled with DataStax Enterprise. The Spark web interface facilitates monitoring, debugging, and managing Spark.

spark-env.sh

The default location of the spark-env.sh file depends on the type of installation:
  • Package and Installer-Services installations: /etc/dse/spark/spark-env.sh
  • Tarball and Installer-No Services installations: installation_location/resources/spark/conf/spark-env.sh

spark-defaults.conf

The default location of the spark-defaults.conf file depends on the type of installation:
  • Package and Installer-Services installations: /etc/dse/spark/spark-defaults.conf
  • Tarball and Installer-No Services installations: installation_location/resources/spark/conf/spark-defaults.conf

spark-daemon-defaults.conf

The default location of the spark-daemon-defaults.conf file depends on the type of installation:
  • Package and Installer-Services installations: /etc/dse/spark/spark-daemon-defaults.conf
  • Tarball and Installer-No Services installations: installation_location/resources/spark/conf/spark-daemon-defaults.conf

Using the Spark web interface

To use the Spark web interface:
  • In a browser, enter the listen IP address of any Spark node followed by port 7080. Starting in DSE 5.1, all Spark nodes within an Analytics datacenter redirect to the current Spark Master.
  • To change the port, modify the spark-env.sh configuration file, as shown in the example after this list. If you change the port number, use the same port on every node in the datacenter.
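
For example, the Master web UI port is controlled by the standard Spark variable SPARK_MASTER_WEBUI_PORT in spark-env.sh. A minimal sketch, where 7081 is an arbitrary replacement port:

export SPARK_MASTER_WEBUI_PORT=7081  # example only; set the same port on every node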

If the Spark Master is not available, the UI polls for it every 10 seconds until it becomes available.

The Spark web interface can be secured using SSL. SSL encryption of the web interface is enabled by default when client encryption is enabled.
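
Client encryption is configured in the client_encryption_options section of cassandra.yaml. A minimal sketch with placeholder values, assuming a keystore has already been prepared:

client_encryption_options:
    enabled: true                          # enabling this also enables SSL on the Master and Worker UIs
    keystore: /path/to/.keystore           # placeholder path
    keystore_password: keystore_password   # placeholder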

If authentication is enabled and plain authentication is available, you are prompted for credentials when accessing the web UI. We recommend using SSL together with authentication.

Note: Kerberos authentication is not supported in the Spark web UI. If authentication is enabled and neither LDAP nor internal authentication is available, the Spark web UI is not accessible. In that case, disable authentication for the Spark web UI only by removing the spark.ui.filters setting from spark-daemon-defaults.conf in the Spark configuration directory.
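
To illustrate, the entry to remove in spark-daemon-defaults.conf looks like the following; the filter class name here is a placeholder for the DSE-provided filter, not the actual class:

# Remove or comment out this line to disable authentication for the web UI only:
# spark.ui.filters=<DSE-provided authentication filter class>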

DSE SSL encryption and authentication only apply to the Spark Master and Worker UIs, not the Spark Driver UI. To use encryption and authentication with the Driver UI, refer to the Spark security documentation.
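
As a starting point, open-source Spark configures SSL for its UIs through the spark.ssl.* properties in spark-defaults.conf. The sketch below uses only standard Spark property names with placeholder values; consult the Spark security documentation for the authoritative list:

spark.ssl.enabled            true
spark.ssl.keyStore           /path/to/keystore       # placeholder
spark.ssl.keyStorePassword   keystore_password_here  # placeholder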

Authorization is not supported in the Spark web UI. Any authenticated user can monitor and control any Spark applications within the UI.

See the Spark documentation for information on using the Spark web UI.

Displaying fully qualified domain names in the web UI

To display fully qualified domain names (FQDNs) in the Spark web UI, set the SPARK_PUBLIC_DNS variable in spark-env.sh on each Analytics node.

Set SPARK_PUBLIC_DNS to the FQDN of the node if you have SSL enabled for the web UI.
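
For example (the host name is a placeholder):

export SPARK_PUBLIC_DNS=node1.example.com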

Redirecting to the fully qualified domain name of the master

To force redirects to the web UI to use the FQDN of the Spark Master, set SPARK_LOCAL_IP or SPARK_LOCAL_HOSTNAME in the spark-env.sh file on each node to the node's fully qualified domain name (FQDN). This is useful when SSL is enabled for the web UI.

export SPARK_LOCAL_HOSTNAME=<FQDN of the node>

Filtering properties in the Spark Driver UI

The Spark Driver UI has an Environment tab that lists the Spark configuration and system properties used by Spark, which can include sensitive information such as passwords and security tokens. DSE Spark filters these properties and masks their values with sequences of asterisks. The spark.ui.confidentialKeys filter is configured as a comma-separated list of regular expressions; by default it matches all properties that contain the string "token" or "password". To modify the filter, edit the spark.ui.confidentialKeys property in spark-defaults.conf in the Spark configuration directory.
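
For instance, to also mask any property whose name contains "secret", the filter could be extended as follows; this line restates the documented defaults plus one extra pattern and is illustrative, not the verbatim DSE default:

spark.ui.confidentialKeys .*token.*,.*password.*,.*secret.*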