Represents a single row fetched from Cassandra.
Offers getters to read individual fields by column name or column index. The getters try to convert the value to the desired type whenever possible. Most column types can be converted to a String.

For nullable columns, use the getXXXOption getters, which convert nulls to None values; otherwise a NullPointerException would be thrown.

All getters throw an exception if the column name/index is not found. Column indexes start at 0. If the value cannot be converted to the desired type, com.datastax.spark.connector.types.TypeConversionException is thrown.
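As a sketch of the difference between a plain getter and its Option variant (the row and the nullable text column last_name here are hypothetical):

```scala
// Hypothetical row from a table with a nullable `last_name` text column.
val name: String = row.getString("last_name")                  // throws NullPointerException if the value is null
val nameOpt: Option[String] = row.getStringOption("last_name") // yields None if the value is null

nameOpt match {
  case Some(n) => println(s"Last name: $n")
  case None    => println("Last name not set")
}
```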
Recommended getters for Cassandra types:
- ascii: getString, getStringOption
- bigint: getLong, getLongOption
- blob: getBytes, getBytesOption
- boolean: getBool, getBoolOption
- counter: getLong, getLongOption
- decimal: getDecimal, getDecimalOption
- double: getDouble, getDoubleOption
- float: getFloat, getFloatOption
- inet: getInet, getInetOption
- int: getInt, getIntOption
- text: getString, getStringOption
- timestamp: getDate, getDateOption
- timeuuid: getUUID, getUUIDOption
- uuid: getUUID, getUUIDOption
- varchar: getString, getStringOption
- varint: getVarInt, getVarIntOption
- list: getList[T]
- set: getSet[T]
- map: getMap[K, V]
The collection getters getList, getSet and getMap require the item type to be passed explicitly:

row.getList[String]("a_list")
row.getList[Int]("a_list")
row.getMap[Int, String]("a_map")
The generic get method can automatically convert collections to other collection types.
Supported containers:
- scala.collection.immutable.List
- scala.collection.immutable.Set
- scala.collection.immutable.TreeSet
- scala.collection.immutable.Vector
- scala.collection.immutable.Map
- scala.collection.immutable.TreeMap
- scala.collection.Iterable
- scala.collection.IndexedSeq
- java.util.ArrayList
- java.util.HashSet
- java.util.HashMap
Example:

row.get[List[Int]]("a_list")
row.get[Vector[Int]]("a_list")
row.get[java.util.ArrayList[Int]]("a_list")
row.get[TreeMap[Int, String]]("a_map")
Timestamps can be converted to other date types by using the generic get. Supported date types:
- java.util.Date
- java.sql.Date
- org.joda.time.DateTime
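A sketch of such conversions; the column name "a_timestamp" is an assumed example:

```scala
// "a_timestamp" is a hypothetical timestamp column; each call converts
// the same value into a different date representation.
val asJavaDate: java.util.Date         = row.get[java.util.Date]("a_timestamp")
val asSqlDate:  java.sql.Date          = row.get[java.sql.Date]("a_timestamp")
val asJoda:     org.joda.time.DateTime = row.get[org.joda.time.DateTime]("a_timestamp")
```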
Data shared by all CassandraRow instances:
- row column names
- column names from the Java driver result set, without connector aliases
- cached Java driver codecs, to avoid registry lookups
Insert behaviors for Collections.
References a collection column by name, together with insert instructions.
References a column by name.
Thrown when the requested column does not exist in the result set.
A column that can be selected from a CQL result set by name.
Provides Cassandra-specific methods on org.apache.spark.sql.DataFrame
References a function call.
Provides Cassandra-specific methods on RDD
Provides Cassandra-specific methods on SparkContext
References TTL of a column.
References write time of a column.
References the row count value returned from SELECT count(*).
Contains a cql.CassandraConnector object which is used to connect to a Cassandra cluster and to send CQL statements to it. CassandraConnector provides a Scala-idiomatic way of working with Cluster and Session objects and takes care of connection pooling and proper resource disposal.
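A minimal sketch of the pooling described above, assuming a SparkContext sc whose configuration sets spark.cassandra.connection.host, and a hypothetical keyspace test:

```scala
import com.datastax.spark.connector.cql.CassandraConnector

// Build a connector from the Spark configuration (assumes the
// spark.cassandra.connection.host property is already set there).
val connector = CassandraConnector(sc.getConf)

connector.withSessionDo { session =>
  // The session is borrowed from the pool and released automatically
  // when this block completes, even if an exception is thrown.
  session.execute(
    "CREATE TABLE IF NOT EXISTS test.words (word text PRIMARY KEY, count int)")
}
```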
Provides machinery for mapping Cassandra tables to user-defined Scala classes or tuples. The main class in this package is mapper.ColumnMapper, responsible for matching a Scala object's properties with Cassandra column names.
Contains the com.datastax.spark.connector.rdd.CassandraTableScanRDD class, which is the main entry point for analyzing Cassandra data from Spark.
Offers type conversion magic, so you can receive Cassandra column values in a form you like the most. Simply specify the type you want to use on the Scala side, and the column value will be converted automatically. It also works with complex objects like collections.
Useful stuff that didn't fit elsewhere.
Contains components for writing RDDs to Cassandra.
The root package of the Cassandra connector for Apache Spark. Offers handy implicit conversions that add Cassandra-specific methods to SparkContext and RDD.

Call the cassandraTable method on the SparkContext object to create a CassandraRDD exposing Cassandra tables as Spark RDDs.

Call the RDDFunctions saveToCassandra function on any RDD to save a distributed collection to a Cassandra table.

Example: