Accessing the Spark session and context for applications running outside of DSE Analytics

You can optionally create session and context objects for applications that run outside of the DSE Analytics environment. This is an advanced use case for applications that do not use dse spark-submit to handle the classpath and configuration settings.

The application must handle all classpath construction and JAR distribution itself. The application classpath must include the output of the dse spark-classpath command:

dse spark-classpath
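For example, a launch script might capture the command's output and prepend it to the application classpath. A minimal sketch (the JAR path and main class here are placeholders, not part of DSE):

java -cp "$(dse spark-classpath):target/myapp.jar" com.example.MyApp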

Using the Builder API to create a DSE Spark session

To create a DSE Spark session outside of the DSE Analytics environment, import the implicits from the DseConfiguration class and call the enableDseSupport method on the session builder.

import org.apache.spark.sql.SparkSession
import com.datastax.spark.connector.DseConfiguration._

// enableDseSupport is added to the builder by the DseConfiguration implicits.
val spark = SparkSession.builder
  .appName("Datastax Scala example")
  .master("dse://127.0.0.1?")
  .config("spark.jars", "target/scala-2.11/writeread_2.11-0.1.jar")
  .enableHiveSupport()
  .enableDseSupport()
  .getOrCreate()
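Once created, the session can be used like any other Spark session. For example, a minimal sketch of reading a table through the Spark Cassandra Connector's Data Source API, assuming a hypothetical keyspace ks containing a table kv:

import org.apache.spark.sql.cassandra._

// "kv" and "ks" are placeholder table and keyspace names.
val df = spark.read
  .cassandraFormat("kv", "ks")
  .load()
df.show()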

Creating a Spark Context

When creating a SparkContext object, import the implicits from the DseConfiguration class and call the enableDseSupport method on the SparkConf instance. In Scala:

import org.apache.spark.SparkConf
import com.datastax.spark.connector.DseConfiguration._

val conf = new SparkConf().enableDseSupport()
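A fuller sketch that builds the configuration and creates the context, assuming a DSE node on the local host (the application name is a placeholder):

import org.apache.spark.{SparkConf, SparkContext}
import com.datastax.spark.connector.DseConfiguration._

// Build a DSE-enabled configuration, then create the context from it.
val conf = new SparkConf()
  .setAppName("Datastax context example")
  .setMaster("dse://127.0.0.1?")
  .enableDseSupport()
val sc = new SparkContext(conf)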

In Java:

import org.apache.spark.SparkConf;
import com.datastax.spark.connector.DseConfiguration;

SparkConf rawConf = new SparkConf();
SparkConf conf = DseConfiguration.enableDseSupport(rawConf);
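The resulting conf can then be passed to a SparkContext or JavaSparkContext constructor in the usual way.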