dse spark
Enters the interactive Spark shell and offers basic auto-completion.
Restriction: Command is supported only on nodes with analytics workloads.
Synopsis
dse connection_options spark
[-framework dse|spark-2.0] [--help] [--verbose]
[--conf name=spark.value|sparkproperties.conf]
[--executor-memory mem]
[--jars additional-jars]
[--master dse://?appReconnectionTimeoutSeconds=secs]
[--properties-file path_to_properties_file]
[--total-executor-cores cores]
[-i app_script_file]
Syntax conventions
| Syntax conventions | Description |
| --- | --- |
| UPPERCASE | Literal keyword. |
| Lowercase | Not literal. |
| *Italics* | Variable value. Replace with a valid option or user-defined value. |
| [ ] | Optional. Square brackets ( [ ] ) surround optional command arguments. Do not type the square brackets. |
| ( ) | Group. Parentheses ( ( ) ) identify a group to choose from. Do not type the parentheses. |
| \| | Or. A vertical bar ( \| ) separates alternative elements. Type any one of the elements. Do not type the vertical bar. |
| ... | Repeatable. An ellipsis ( ... ) indicates that you can repeat the syntax element as often as required. |
| 'Literal string' | Single quotation ( ' ) marks must surround literal strings in CQL statements. Use single quotation marks to preserve upper case. |
| { key:value } | Map collection. Braces ( { } ) enclose map collections or key-value pairs. |
| <datatype1,datatype2> | Set, list, map, or tuple. Angle brackets ( < > ) enclose data types in a set, list, map, or tuple. Separate the data types with a comma. |
| cql_statement; | End CQL statement. A semicolon ( ; ) terminates all CQL statements. |
| [--] | Separate the command line options from the command arguments with two hyphens ( -- ). This syntax is useful when arguments might be mistaken for command line options. |
| ' <schema> ... </schema> ' | Search CQL only: Single quotation marks ( ' ) surround an entire XML schema declaration. |
| @xml_entity='xml_entity_type' | Search CQL only: Identify the entity and literal value to overwrite the XML element in the schema and solrconfig files. |
In general, Spark submission arguments (--submission_args) are translated into system properties (-Dname=value) and other VM parameters, such as the classpath. Application arguments (-app_args) are passed directly to the application.
Configure the Spark shell with these arguments:
--conf name=spark.value|sparkproperties.conf
  An arbitrary Spark option to add to the Spark configuration, prefixed by spark. Takes one of two forms:
  - name=spark.value - a single property and its value
  - sparkproperties.conf - a configuration file
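For example, a single property can be set when starting the shell (the property shown is illustrative):
dse spark --conf spark.ui.port=4041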
--executor-memory mem
  The amount of memory that each executor can consume for the application. Spark uses a 512 MB default. Specify the memory argument in JVM format using the k, m, or g suffix.
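For example, to allocate 2 GB of memory per executor (an illustrative value):
dse spark --executor-memory 2g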
-framework dse|spark-2.0
  The classpath for the Spark shell. When not set, the default is dse.
  - dse - Sets the Spark classpath to the same classpath that is used by the DSE server.
  - spark-2.0 - Sets a classpath that matches the open source Spark (OSS) 2.0 release to accommodate applications originally written for open source Apache Spark. Uses a BYOS (Bring Your Own Spark) JAR with shaded references to internal dependencies to eliminate complexity when porting an app from OSS Spark. If the code works on DSE, applications do not require the spark-2.0 framework. Full support in the spark-2.0 framework might require specifying additional dependencies. For example, hadoop-aws is included on the DSE server classpath but is not present on the OSS Spark 2.0 classpath; applications that use S3 or other AWS APIs must therefore include their own aws-sdk on the runtime classpath. This additional runtime classpath is required only for applications that cannot run on the DSE classpath.
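For example, to start the shell with the OSS-compatible classpath:
dse spark -framework spark-2.0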
--help
  Shows a help message that displays all options except DataStax Enterprise Spark shell options.
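To display the help message:
dse spark --help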
-i app_script_file
  Spark shell application argument that runs a script from the specified file.
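For example, to run a script when the shell starts (the file path is illustrative):
dse spark -i /path/to/setup.scala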
--jars path_to_additional_jars
  A comma-separated list of paths to additional JAR files.
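For example, to add two JAR files to the shell classpath (the paths are illustrative):
dse spark --jars /opt/libs/extra1.jar,/opt/libs/extra2.jar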
--master dse://?appReconnectionTimeoutSeconds=secs
  Sets a custom reconnection timeout when submitting the application, which is useful for troubleshooting Spark application failures. The default timeout value is 5 seconds.
--properties-file path_to_properties_file
  The location of the properties file that has the configuration settings. By default, Spark loads the settings from spark-defaults.conf.
--total-executor-cores cores
  The total number of cores the application uses.
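For example, to load settings from a custom properties file and cap the application at four cores (the file path and core count are illustrative):
dse spark --properties-file /path/to/custom-spark.conf --total-executor-cores 4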
--verbose
  Displays which arguments are recognized as Spark configuration options and which arguments are forwarded to the Spark shell.
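For example, to see how the shell parses the supplied arguments (the memory value is illustrative):
dse spark --verbose --executor-memory 2g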
Examples
Start the Spark shell
dse spark
Start the Spark shell with case sensitivity enabled
DseGraphFrame and Spark SQL are case-insensitive by default, so column names that differ only in case result in conflicts. Setting the Spark property spark.sql.caseSensitive=true avoids these conflicts.
dse spark --conf spark.sql.caseSensitive=true
Set the timeout value to 10 seconds
dse spark --master dse://?appReconnectionTimeoutSeconds=10
This setting is useful for troubleshooting; see Detecting Spark application failures.