com.datastax.spark.connector.rdd

CassandraRDD

Related Docs: object CassandraRDD | package rdd

abstract class CassandraRDD[R] extends RDD[R]

Linear Supertypes
RDD[R], Logging, Serializable, Serializable, AnyRef, Any
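
A minimal usage sketch (not part of the original Scaladoc): concrete CassandraRDD instances are normally obtained through the implicit SparkContext extensions rather than constructed directly. The keyspace and table names below are hypothetical.

    import com.datastax.spark.connector._

    // cassandraTable returns a CassandraTableScanRDD[CassandraRow],
    // a concrete subclass of CassandraRDD
    val rdd = sc.cassandraTable("ks", "users")
    rdd.select("name").where("country = ?", "PL").collect()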

Instance Constructors

  1. new CassandraRDD(sc: SparkContext, dep: Seq[Dependency[_]])(implicit arg0: ClassTag[R])

Type Members

  1. abstract type Self <: CassandraRDD[R]

    This is slightly different from Scala's this.type. this.type is the unique singleton type of an object, which is not compatible with other instances of the same type, so returning anything other than this is not really possible without lying to the compiler via explicit casts. Self is therefore used to return a copy of the object: a different instance of the same type.
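
    A minimal sketch of the pattern (all names except Self are hypothetical), showing why chainable methods can return the concrete subclass type rather than the base type:

    abstract class Base[R] {
      type Self <: Base[R]
      // returns a new instance of the concrete subtype, never `this`
      protected def copySelf(): Self

      def tweaked(): Self = copySelf()   // chainable, preserves the subtype
    }

    class Concrete[R] extends Base[R] {
      type Self = Concrete[R]
      protected def copySelf(): Self = new Concrete[R]
    }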

Abstract Value Members

  1. abstract def cassandraCount(): Long

    Counts the number of items in this RDD by selecting count(*) on the Cassandra table.
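
    For example (a hedged sketch; keyspace and table names are hypothetical), this pushes the count down to Cassandra instead of fetching rows into Spark the way RDD.count() does:

    val rdd = sc.cassandraTable("ks", "users")
    val n: Long = rdd.cassandraCount()   // executes SELECT count(*) on the server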

  2. abstract def clusteringOrder: Option[ClusteringOrder]

    Attributes
    protected
  3. abstract def columnNames: ColumnSelector

    Attributes
    protected
  4. abstract def compute(split: Partition, context: TaskContext): Iterator[R]

    Definition Classes
    RDD
    Annotations
    @DeveloperApi()
  5. abstract def connector: CassandraConnector

    Attributes
    protected
  6. abstract def copy(columnNames: ColumnSelector = columnNames, where: CqlWhereClause = where, limit: Option[CassandraLimit] = limit, clusteringOrder: Option[ClusteringOrder] = None, readConf: ReadConf = readConf, connector: CassandraConnector = connector): Self

    Allows this RDD to be copied with some of its properties changed.

    Attributes
    protected
  7. abstract def getPartitions: Array[Partition]

    Attributes
    protected
    Definition Classes
    RDD
  8. abstract def keyspaceName: String

    Attributes
    protected[com.datastax.spark.connector]
  9. abstract def limit: Option[CassandraLimit]

    Attributes
    protected
  10. abstract def narrowColumnSelection(columns: Seq[ColumnRef]): Seq[ColumnRef]

    Attributes
    protected
  11. abstract def readConf: ReadConf

    Attributes
    protected
  12. abstract val selectedColumnRefs: Seq[ColumnRef]

  13. abstract def tableName: String

    Attributes
    protected[com.datastax.spark.connector]
  14. abstract def toEmptyCassandraRDD: EmptyCassandraRDD[R]

  15. abstract def where: CqlWhereClause

    Attributes
    protected

Concrete Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. def ++(other: RDD[R]): RDD[R]

    Definition Classes
    RDD
  4. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  5. def aggregate[U](zeroValue: U)(seqOp: (U, R) ⇒ U, combOp: (U, U) ⇒ U)(implicit arg0: ClassTag[U]): U

    Definition Classes
    RDD
  6. def as[B, A0, A1, A2, A3, A4, A5, A6, A7, A8, A9, A10, A11](f: (A0, A1, A2, A3, A4, A5, A6, A7, A8, A9, A10, A11) ⇒ B)(implicit arg0: ClassTag[B], arg1: TypeConverter[A0], arg2: TypeConverter[A1], arg3: TypeConverter[A2], arg4: TypeConverter[A3], arg5: TypeConverter[A4], arg6: TypeConverter[A5], arg7: TypeConverter[A6], arg8: TypeConverter[A7], arg9: TypeConverter[A8], arg10: TypeConverter[A9], arg11: TypeConverter[A10], arg12: TypeConverter[A11]): CassandraRDD[B]

  7. def as[B, A0, A1, A2, A3, A4, A5, A6, A7, A8, A9, A10](f: (A0, A1, A2, A3, A4, A5, A6, A7, A8, A9, A10) ⇒ B)(implicit arg0: ClassTag[B], arg1: TypeConverter[A0], arg2: TypeConverter[A1], arg3: TypeConverter[A2], arg4: TypeConverter[A3], arg5: TypeConverter[A4], arg6: TypeConverter[A5], arg7: TypeConverter[A6], arg8: TypeConverter[A7], arg9: TypeConverter[A8], arg10: TypeConverter[A9], arg11: TypeConverter[A10]): CassandraRDD[B]

  8. def as[B, A0, A1, A2, A3, A4, A5, A6, A7, A8, A9](f: (A0, A1, A2, A3, A4, A5, A6, A7, A8, A9) ⇒ B)(implicit arg0: ClassTag[B], arg1: TypeConverter[A0], arg2: TypeConverter[A1], arg3: TypeConverter[A2], arg4: TypeConverter[A3], arg5: TypeConverter[A4], arg6: TypeConverter[A5], arg7: TypeConverter[A6], arg8: TypeConverter[A7], arg9: TypeConverter[A8], arg10: TypeConverter[A9]): CassandraRDD[B]

  9. def as[B, A0, A1, A2, A3, A4, A5, A6, A7, A8](f: (A0, A1, A2, A3, A4, A5, A6, A7, A8) ⇒ B)(implicit arg0: ClassTag[B], arg1: TypeConverter[A0], arg2: TypeConverter[A1], arg3: TypeConverter[A2], arg4: TypeConverter[A3], arg5: TypeConverter[A4], arg6: TypeConverter[A5], arg7: TypeConverter[A6], arg8: TypeConverter[A7], arg9: TypeConverter[A8]): CassandraRDD[B]

  10. def as[B, A0, A1, A2, A3, A4, A5, A6, A7](f: (A0, A1, A2, A3, A4, A5, A6, A7) ⇒ B)(implicit arg0: ClassTag[B], arg1: TypeConverter[A0], arg2: TypeConverter[A1], arg3: TypeConverter[A2], arg4: TypeConverter[A3], arg5: TypeConverter[A4], arg6: TypeConverter[A5], arg7: TypeConverter[A6], arg8: TypeConverter[A7]): CassandraRDD[B]

  11. def as[B, A0, A1, A2, A3, A4, A5, A6](f: (A0, A1, A2, A3, A4, A5, A6) ⇒ B)(implicit arg0: ClassTag[B], arg1: TypeConverter[A0], arg2: TypeConverter[A1], arg3: TypeConverter[A2], arg4: TypeConverter[A3], arg5: TypeConverter[A4], arg6: TypeConverter[A5], arg7: TypeConverter[A6]): CassandraRDD[B]

  12. def as[B, A0, A1, A2, A3, A4, A5](f: (A0, A1, A2, A3, A4, A5) ⇒ B)(implicit arg0: ClassTag[B], arg1: TypeConverter[A0], arg2: TypeConverter[A1], arg3: TypeConverter[A2], arg4: TypeConverter[A3], arg5: TypeConverter[A4], arg6: TypeConverter[A5]): CassandraRDD[B]

  13. def as[B, A0, A1, A2, A3, A4](f: (A0, A1, A2, A3, A4) ⇒ B)(implicit arg0: ClassTag[B], arg1: TypeConverter[A0], arg2: TypeConverter[A1], arg3: TypeConverter[A2], arg4: TypeConverter[A3], arg5: TypeConverter[A4]): CassandraRDD[B]

  14. def as[B, A0, A1, A2, A3](f: (A0, A1, A2, A3) ⇒ B)(implicit arg0: ClassTag[B], arg1: TypeConverter[A0], arg2: TypeConverter[A1], arg3: TypeConverter[A2], arg4: TypeConverter[A3]): CassandraRDD[B]

  15. def as[B, A0, A1, A2](f: (A0, A1, A2) ⇒ B)(implicit arg0: ClassTag[B], arg1: TypeConverter[A0], arg2: TypeConverter[A1], arg3: TypeConverter[A2]): CassandraRDD[B]

  16. def as[B, A0, A1](f: (A0, A1) ⇒ B)(implicit arg0: ClassTag[B], arg1: TypeConverter[A0], arg2: TypeConverter[A1]): CassandraRDD[B]

  17. def as[B, A0](f: (A0) ⇒ B)(implicit arg0: ClassTag[B], arg1: TypeConverter[A0]): CassandraRDD[B]

    Maps each row into an object of a different type using the provided function, taking column value(s) as argument(s). Can be used to convert each row to a tuple or a case class object:

    sc.cassandraTable("ks", "table")
      .select("column1")
      .as((s: String) => s)                 // yields CassandraRDD[String]
    
    sc.cassandraTable("ks", "table")
      .select("column1", "column2")
      .as((_: String, _: Long))             // yields CassandraRDD[(String, Long)]
    
    case class MyRow(key: String, value: Long)
    sc.cassandraTable("ks", "table")
      .select("column1", "column2")
      .as(MyRow)                            // yields CassandraRDD[MyRow]
  18. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  19. def cache(): CassandraRDD.this.type

    Definition Classes
    RDD
  20. def cartesian[U](other: RDD[U])(implicit arg0: ClassTag[U]): RDD[(R, U)]

    Definition Classes
    RDD
  21. def checkpoint(): Unit

    Definition Classes
    RDD
  22. def clearDependencies(): Unit

    Attributes
    protected
    Definition Classes
    RDD
  23. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  24. def clusteringOrder(order: ClusteringOrder): Self

    Adds a CQL ORDER BY clause to the query. It can be applied only when the table has clustering columns and a primary key predicate is pushed down in where. It is useful when the default direction of ordering rows within a single Cassandra partition needs to be changed.
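
    For example (a hedged sketch; the table and columns are hypothetical, and event_time is assumed to be a clustering column):

    import com.datastax.spark.connector._
    import com.datastax.spark.connector.rdd.ClusteringOrder

    sc.cassandraTable("ks", "events")
      .where("sensor_id = ?", "s-42")               // single Cassandra partition
      .clusteringOrder(ClusteringOrder.Descending)  // newest rows first
      .limit(10)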

  25. def coalesce(numPartitions: Int, shuffle: Boolean, partitionCoalescer: Option[PartitionCoalescer])(implicit ord: Ordering[R]): RDD[R]

    Definition Classes
    RDD
  26. def collect[U](f: PartialFunction[R, U])(implicit arg0: ClassTag[U]): RDD[U]

    Definition Classes
    RDD
  27. def collect(): Array[R]

    Definition Classes
    RDD
  28. def context: SparkContext

    Definition Classes
    RDD
  29. def convertTo[B](implicit arg0: ClassTag[B], arg1: RowReaderFactory[B]): CassandraRDD[B]

    Attributes
    protected
  30. def count(): Long

    Definition Classes
    RDD
  31. def countApprox(timeout: Long, confidence: Double): PartialResult[BoundedDouble]

    Definition Classes
    RDD
  32. def countApproxDistinct(relativeSD: Double): Long

    Definition Classes
    RDD
  33. def countApproxDistinct(p: Int, sp: Int): Long

    Definition Classes
    RDD
  34. def countByValue()(implicit ord: Ordering[R]): Map[R, Long]

    Definition Classes
    RDD
  35. def countByValueApprox(timeout: Long, confidence: Double)(implicit ord: Ordering[R]): PartialResult[Map[R, BoundedDouble]]

    Definition Classes
    RDD
  36. final def dependencies: Seq[Dependency[_]]

    Definition Classes
    RDD
  37. def distinct(): RDD[R]

    Definition Classes
    RDD
  38. def distinct(numPartitions: Int)(implicit ord: Ordering[R]): RDD[R]

    Definition Classes
    RDD
  39. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  40. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  41. def filter(f: (R) ⇒ Boolean): RDD[R]

    Definition Classes
    RDD
  42. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  43. def first(): R

    Definition Classes
    RDD
  44. def firstParent[U](implicit arg0: ClassTag[U]): RDD[U]

    Attributes
    protected[org.apache.spark]
    Definition Classes
    RDD
  45. def flatMap[U](f: (R) ⇒ TraversableOnce[U])(implicit arg0: ClassTag[U]): RDD[U]

    Definition Classes
    RDD
  46. def fold(zeroValue: R)(op: (R, R) ⇒ R): R

    Definition Classes
    RDD
  47. def foreach(f: (R) ⇒ Unit): Unit

    Definition Classes
    RDD
  48. def foreachPartition(f: (Iterator[R]) ⇒ Unit): Unit

    Definition Classes
    RDD
  49. def getCheckpointFile: Option[String]

    Definition Classes
    RDD
  50. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  51. def getDependencies: Seq[Dependency[_]]

    Attributes
    protected
    Definition Classes
    RDD
  52. final def getNumPartitions: Int

    Definition Classes
    RDD
    Annotations
    @Since( "1.6.0" )
  53. def getOutputDeterministicLevel: org.apache.spark.rdd.DeterministicLevel.Value

    Attributes
    protected
    Definition Classes
    RDD
    Annotations
    @DeveloperApi()
  54. def getPreferredLocations(split: Partition): Seq[String]

    Attributes
    protected
    Definition Classes
    RDD
  55. def getStorageLevel: StorageLevel

    Definition Classes
    RDD
  56. def glom(): RDD[Array[R]]

    Definition Classes
    RDD
  57. def groupBy[K](f: (R) ⇒ K, p: Partitioner)(implicit kt: ClassTag[K], ord: Ordering[K]): RDD[(K, Iterable[R])]

    Definition Classes
    RDD
  58. def groupBy[K](f: (R) ⇒ K, numPartitions: Int)(implicit kt: ClassTag[K]): RDD[(K, Iterable[R])]

    Definition Classes
    RDD
  59. def groupBy[K](f: (R) ⇒ K)(implicit kt: ClassTag[K]): RDD[(K, Iterable[R])]

    Definition Classes
    RDD
  60. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  61. val id: Int

    Definition Classes
    RDD
  62. def initializeLogIfNecessary(isInterpreter: Boolean): Unit

    Attributes
    protected
    Definition Classes
    Logging
  63. def intersection(other: RDD[R], numPartitions: Int): RDD[R]

    Definition Classes
    RDD
  64. def intersection(other: RDD[R], partitioner: Partitioner)(implicit ord: Ordering[R]): RDD[R]

    Definition Classes
    RDD
  65. def intersection(other: RDD[R]): RDD[R]

    Definition Classes
    RDD
  66. def isCheckpointed: Boolean

    Definition Classes
    RDD
  67. def isEmpty(): Boolean

    Definition Classes
    RDD
  68. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  69. def isTraceEnabled(): Boolean

    Attributes
    protected
    Definition Classes
    Logging
  70. final def iterator(split: Partition, context: TaskContext): Iterator[R]

    Definition Classes
    RDD
  71. def keyBy[K](f: (R) ⇒ K): RDD[(K, R)]

    Definition Classes
    RDD
  72. def limit(rowLimit: Long): Self

    Adds the LIMIT clause to the CQL SELECT statement. The limit is applied to each created Spark partition; in other words, unless the data is fetched from a single Cassandra partition, the number of results is unpredictable.

    The main purpose of passing a limit is to fetch the top n rows from a single Cassandra partition, when the table is designed so that it uses clustering keys and a partition key predicate is passed in the where clause.
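
    For example (a hedged sketch; the table and columns are hypothetical), fetching the top 3 rows of a single Cassandra partition:

    sc.cassandraTable("ks", "top_scores")
      .where("game = ?", "chess")   // restricts the query to one Cassandra partition
      .limit(3)                     // CQL LIMIT 3; deterministic only because of the where clause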

  73. def localCheckpoint(): CassandraRDD.this.type

    Definition Classes
    RDD
  74. def log: Logger

    Attributes
    protected
    Definition Classes
    Logging
  75. def logDebug(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  76. def logDebug(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  77. def logError(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  78. def logError(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  79. def logInfo(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  80. def logInfo(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  81. def logName: String

    Attributes
    protected
    Definition Classes
    Logging
  82. def logTrace(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  83. def logTrace(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  84. def logWarning(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  85. def logWarning(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  86. def map[U](f: (R) ⇒ U)(implicit arg0: ClassTag[U]): RDD[U]

    Definition Classes
    RDD
  87. def mapPartitions[U](f: (Iterator[R]) ⇒ Iterator[U], preservesPartitioning: Boolean)(implicit arg0: ClassTag[U]): RDD[U]

    Definition Classes
    RDD
  88. def mapPartitionsWithIndex[U](f: (Int, Iterator[R]) ⇒ Iterator[U], preservesPartitioning: Boolean)(implicit arg0: ClassTag[U]): RDD[U]

    Definition Classes
    RDD
  89. def max()(implicit ord: Ordering[R]): R

    Definition Classes
    RDD
  90. def min()(implicit ord: Ordering[R]): R

    Definition Classes
    RDD
  91. var name: String

    Definition Classes
    RDD
  92. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  93. final def notify(): Unit

    Definition Classes
    AnyRef
  94. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  95. def parent[U](j: Int)(implicit arg0: ClassTag[U]): RDD[U]

    Attributes
    protected[org.apache.spark]
    Definition Classes
    RDD
  96. val partitioner: Option[Partitioner]

    Definition Classes
    RDD
  97. final def partitions: Array[Partition]

    Definition Classes
    RDD
  98. def perPartitionLimit(rowLimit: Long): Self

    Adds the PER PARTITION LIMIT clause to the CQL SELECT statement. The limit is applied to every Cassandra partition. Only valid for Cassandra 3.6+.
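
    For example (a hedged sketch; the table name is hypothetical), keeping only the first clustering row of every Cassandra partition:

    sc.cassandraTable("ks", "sensor_readings")
      .perPartitionLimit(1)   // CQL PER PARTITION LIMIT 1; requires Cassandra 3.6+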

  99. def persist(): CassandraRDD.this.type

    Definition Classes
    RDD
  100. def persist(newLevel: StorageLevel): CassandraRDD.this.type

    Definition Classes
    RDD
  101. def pipe(command: Seq[String], env: Map[String, String], printPipeContext: ((String) ⇒ Unit) ⇒ Unit, printRDDElement: (R, (String) ⇒ Unit) ⇒ Unit, separateWorkingDir: Boolean, bufferSize: Int, encoding: String): RDD[String]

    Definition Classes
    RDD
  102. def pipe(command: String, env: Map[String, String]): RDD[String]

    Definition Classes
    RDD
  103. def pipe(command: String): RDD[String]

    Definition Classes
    RDD
  104. final def preferredLocations(split: Partition): Seq[String]

    Definition Classes
    RDD
  105. def randomSplit(weights: Array[Double], seed: Long): Array[RDD[R]]

    Definition Classes
    RDD
  106. def reduce(f: (R, R) ⇒ R): R

    Definition Classes
    RDD
  107. def repartition(numPartitions: Int)(implicit ord: Ordering[R]): RDD[R]

    Definition Classes
    RDD
  108. def sample(withReplacement: Boolean, fraction: Double, seed: Long): RDD[R]

    Definition Classes
    RDD
  109. def saveAsObjectFile(path: String): Unit

    Definition Classes
    RDD
  110. def saveAsTextFile(path: String, codec: Class[_ <: CompressionCodec]): Unit

    Definition Classes
    RDD
  111. def saveAsTextFile(path: String): Unit

    Definition Classes
    RDD
  112. def select(columns: ColumnRef*): Self

    Narrows down the selected set of columns. Use this for better performance when you don't need all the columns in the result RDD. When called multiple times, it selects a subset of the already selected columns, so once a column has been removed by a previous select call, it cannot be added back.

    The selected columns are ColumnRef instances. This type allows columns to be specified for straightforward retrieval, as well as for reading the TTL or write time of regular columns. Implicit conversions included in the com.datastax.spark.connector package make it possible to provide plain column names (which is also backward compatible) and optionally append a .ttl or .writeTime suffix in order to create the appropriate ColumnRef instance.
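
    For example (a hedged sketch; the table and columns are hypothetical):

    import com.datastax.spark.connector._

    sc.cassandraTable("ks", "users")
      .select("name", "email", "email".writeTime, "email".ttl)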

  113. def selectedColumnNames: Seq[String]

  114. def setName(_name: String): CassandraRDD.this.type

    Definition Classes
    RDD
  115. def sortBy[K](f: (R) ⇒ K, ascending: Boolean, numPartitions: Int)(implicit ord: Ordering[K], ctag: ClassTag[K]): RDD[R]

    Definition Classes
    RDD
  116. def sparkContext: SparkContext

    Definition Classes
    RDD
  117. def subtract(other: RDD[R], p: Partitioner)(implicit ord: Ordering[R]): RDD[R]

    Definition Classes
    RDD
  118. def subtract(other: RDD[R], numPartitions: Int): RDD[R]

    Definition Classes
    RDD
  119. def subtract(other: RDD[R]): RDD[R]

    Definition Classes
    RDD
  120. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  121. def take(num: Int): Array[R]

    Definition Classes
    CassandraRDD → RDD
  122. def takeOrdered(num: Int)(implicit ord: Ordering[R]): Array[R]

    Definition Classes
    RDD
  123. def takeSample(withReplacement: Boolean, num: Int, seed: Long): Array[R]

    Definition Classes
    RDD
  124. def toDebugString: String

    Definition Classes
    RDD
  125. def toJavaRDD(): JavaRDD[R]

    Definition Classes
    RDD
  126. def toLocalIterator: Iterator[R]

    Definition Classes
    RDD
  127. def toString(): String

    Definition Classes
    RDD → AnyRef → Any
  128. def top(num: Int)(implicit ord: Ordering[R]): Array[R]

    Definition Classes
    RDD
  129. def treeAggregate[U](zeroValue: U)(seqOp: (U, R) ⇒ U, combOp: (U, U) ⇒ U, depth: Int)(implicit arg0: ClassTag[U]): U

    Definition Classes
    RDD
  130. def treeReduce(f: (R, R) ⇒ R, depth: Int): R

    Definition Classes
    RDD
  131. def union(other: RDD[R]): RDD[R]

    Definition Classes
    RDD
  132. def unpersist(blocking: Boolean): CassandraRDD.this.type

    Definition Classes
    RDD
  133. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  134. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  135. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  136. def where(cql: String, values: Any*): Self

    Adds CQL WHERE predicate(s) to the query. Useful for leveraging secondary indexes in Cassandra. Implicitly adds an ALLOW FILTERING clause to the WHERE clause; however, beware that some predicates might be rejected by Cassandra, particularly when they filter on an unindexed, non-clustering column.
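
    For example (a hedged sketch; the table and columns are hypothetical, and country is assumed to carry a secondary index):

    sc.cassandraTable("ks", "users")
      .where("country = ?", "PL")
      .where("age > ?", 18)   // predicates from repeated calls are combined with AND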

  137. def withAscOrder: Self

  138. def withConnector(connector: CassandraConnector): Self

    Returns a copy of this Cassandra RDD with the specified connector.
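
    For example (a hedged sketch; the host is hypothetical), reading the same table from a different cluster:

    import com.datastax.spark.connector.cql.CassandraConnector

    val otherCluster = CassandraConnector(
      sc.getConf.clone.set("spark.cassandra.connection.host", "10.0.0.2"))

    sc.cassandraTable("ks", "users").withConnector(otherCluster)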

  139. def withDescOrder: Self

  140. def withReadConf(readConf: ReadConf): Self

    Allows a custom read configuration to be set, e.g. consistency level or fetch size.
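
    For example (a hedged sketch; the ReadConf field names follow the connector's API and may differ between versions):

    import com.datastax.spark.connector.rdd.ReadConf

    sc.cassandraTable("ks", "users")
      .withReadConf(ReadConf.fromSparkConf(sc.getConf).copy(fetchSizeInRows = 500))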

  141. def zip[U](other: RDD[U])(implicit arg0: ClassTag[U]): RDD[(R, U)]

    Definition Classes
    RDD
  142. def zipPartitions[B, C, D, V](rdd2: RDD[B], rdd3: RDD[C], rdd4: RDD[D])(f: (Iterator[R], Iterator[B], Iterator[C], Iterator[D]) ⇒ Iterator[V])(implicit arg0: ClassTag[B], arg1: ClassTag[C], arg2: ClassTag[D], arg3: ClassTag[V]): RDD[V]

    Definition Classes
    RDD
  143. def zipPartitions[B, C, D, V](rdd2: RDD[B], rdd3: RDD[C], rdd4: RDD[D], preservesPartitioning: Boolean)(f: (Iterator[R], Iterator[B], Iterator[C], Iterator[D]) ⇒ Iterator[V])(implicit arg0: ClassTag[B], arg1: ClassTag[C], arg2: ClassTag[D], arg3: ClassTag[V]): RDD[V]

    Definition Classes
    RDD
  144. def zipPartitions[B, C, V](rdd2: RDD[B], rdd3: RDD[C])(f: (Iterator[R], Iterator[B], Iterator[C]) ⇒ Iterator[V])(implicit arg0: ClassTag[B], arg1: ClassTag[C], arg2: ClassTag[V]): RDD[V]

    Definition Classes
    RDD
  145. def zipPartitions[B, C, V](rdd2: RDD[B], rdd3: RDD[C], preservesPartitioning: Boolean)(f: (Iterator[R], Iterator[B], Iterator[C]) ⇒ Iterator[V])(implicit arg0: ClassTag[B], arg1: ClassTag[C], arg2: ClassTag[V]): RDD[V]

    Definition Classes
    RDD
  146. def zipPartitions[B, V](rdd2: RDD[B])(f: (Iterator[R], Iterator[B]) ⇒ Iterator[V])(implicit arg0: ClassTag[B], arg1: ClassTag[V]): RDD[V]

    Definition Classes
    RDD
  147. def zipPartitions[B, V](rdd2: RDD[B], preservesPartitioning: Boolean)(f: (Iterator[R], Iterator[B]) ⇒ Iterator[V])(implicit arg0: ClassTag[B], arg1: ClassTag[V]): RDD[V]

    Definition Classes
    RDD
  148. def zipWithIndex(): RDD[(R, Long)]

    Definition Classes
    RDD
  149. def zipWithUniqueId(): RDD[(R, Long)]

    Definition Classes
    RDD
