Monitoring Spark with the web interface
A Spark web interface is bundled with DataStax Enterprise. It facilitates monitoring, debugging, and managing Spark.
spark-daemon-defaults.conf
The default location of the spark-daemon-defaults.conf file depends on the type of installation:
Package installations: /etc/dse/spark/spark-daemon-defaults.conf
Tarball installations: installation_location/resources/spark/conf/spark-daemon-defaults.conf
spark-env.sh
The default location of the spark-env.sh file depends on the type of installation:
Package installations: /etc/dse/spark/spark-env.sh
Tarball installations: installation_location/resources/spark/conf/spark-env.sh
spark-defaults.conf
The default location of the spark-defaults.conf file depends on the type of installation:
Package installations: /etc/dse/spark/spark-defaults.conf
Tarball installations: installation_location/resources/spark/conf/spark-defaults.conf
Using the Spark web interface
To use the Spark web interface, enter the listen IP address of any Spark node in a browser, followed by port 7080 (configured in the spark-env.sh configuration file). Starting in DSE 5.1, all Spark nodes within an Analytics datacenter redirect to the current Spark Master.
If the Spark Master is not available, the UI polls for the Spark Master every 10 seconds until the Master is available.
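For example, for a hypothetical node with listen address 10.10.1.5, point a browser at:
http://10.10.1.5:7080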
The Spark web interface can be secured using SSL. SSL encryption of the web interface is enabled by default when client encryption is enabled.
If authentication is enabled, and plain authentication is available, you are prompted for authentication credentials when accessing the web UI. We recommend using SSL together with authentication. Authentication for the web UI is controlled by the spark.ui.filters setting in spark-daemon-defaults.conf, located in the Spark configuration directory. DSE SSL encryption and authentication apply only to the Spark Master and Worker UIs, not the Spark Driver UI. To use encryption and authentication with the Driver UI, refer to the Spark security documentation.
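As a minimal sketch, the setting uses the standard Spark properties-file format; the filter class name below is a placeholder for illustration, not the actual class shipped with DSE:
spark.ui.filters com.example.AuthenticationFilter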
The UI includes the number of cores and the amount of memory available to Spark, both in total and per work pool, and similar information for each Spark worker. The applications list shows the work pool associated with each application.
See the Spark documentation for information on using the Spark web UI.
Authorization in the Spark web UI
When authorization is enabled and an authenticated user accesses the web UI, what they can see and do is controlled by their permissions. This allows administrators to control who has permission to view specific application logs, view the executors for the application, kill the application, and list all applications. Viewing and modifying applications can be configured per datacenter, work pool, or application.
See Using authorization with Spark for details on granting permissions.
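As a hedged sketch, such grants are expressed in CQL; the role names below are hypothetical, and the exact resource syntax can vary by DSE version, so verify it against Using authorization with Spark:
GRANT DESCRIBE ON ANY WORKPOOL TO analytics_viewer;   -- view applications in the web UI
GRANT MODIFY ON ANY SUBMISSION TO analytics_operator; -- for example, kill applications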
Displaying fully qualified domain names in the web UI
To display fully qualified domain names (FQDNs) in the Spark web UI, set the SPARK_PUBLIC_DNS variable in spark-env.sh on each Analytics node. Set SPARK_PUBLIC_DNS to the FQDN of the node if you have SSL enabled for the web UI.
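For example, in spark-env.sh (the hostname is hypothetical):
export SPARK_PUBLIC_DNS=node1.example.com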
Redirecting to the fully qualified domain name of the master
Set SPARK_LOCAL_IP or SPARK_LOCAL_HOSTNAME in the spark-env.sh file on each node to the fully qualified domain name (FQDN) of the node to force redirects to the web UI to use the FQDN of the Spark Master. This is useful when enabling SSL in the web UI.
export SPARK_LOCAL_HOSTNAME=FQDN_of_the_node
Filtering properties in the Spark Driver UI
The Spark Driver UI has an Environment tab that lists the Spark configuration and system properties used by Spark. These can include sensitive information like passwords and security tokens. DSE Spark filters these properties and masks their values with sequences of asterisks. The spark.redaction.regex filter is configured as a regular expression that by default matches all properties containing the string "secret", "token", or "password", as well as all system properties. To modify the filter, edit the spark.redaction.regex property in spark-defaults.conf in the Spark configuration directory.
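As a sketch, the property is set in spark-defaults.conf like any other Spark property; the pattern below extends the described default with an extra "apikey" term purely for illustration:
spark.redaction.regex (?i)secret|token|password|apikey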