Class com.datastax.spark.connector.streaming.DStreamFunctions

class DStreamFunctions[T] extends WritableToCassandra[T] with Serializable with Logging

Linear Supertypes
Logging, Serializable, Serializable, WritableToCassandra[T], AnyRef, Any

Instance Constructors

  1. new DStreamFunctions(dstream: DStream[T])

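
    Example (a minimal sketch of obtaining DStreamFunctions through the streaming implicits; the keyspace test, table words, and the socket source are illustrative assumptions, not part of this API):

      import com.datastax.spark.connector._
      import com.datastax.spark.connector.streaming._
      import org.apache.spark.SparkConf
      import org.apache.spark.streaming.{Seconds, StreamingContext}

      object DStreamFunctionsExample {
        def main(args: Array[String]): Unit = {
          val conf = new SparkConf()
            .setAppName("dstream-functions-example")
            .set("spark.cassandra.connection.host", "127.0.0.1") // assumed local node

          val ssc = new StreamingContext(conf, Seconds(5))

          // importing com.datastax.spark.connector.streaming._ implicitly wraps
          // any DStream[T] in DStreamFunctions[T]
          val counts = ssc.socketTextStream("localhost", 9999)
            .flatMap(_.split("\\s+"))
            .map(word => (word, 1))

          counts.saveToCassandra("test", "words", SomeColumns("word", "count"))

          ssc.start()
          ssc.awaitTermination()
        }
      }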

Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  5. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  6. def conf: SparkConf

  7. def deleteFromCassandra(keyspaceName: String, tableName: String, deleteColumns: ColumnSelector = SomeColumns(), keyColumns: ColumnSelector = PrimaryKeyColumns, writeConf: WriteConf = ...)(implicit connector: CassandraConnector = CassandraConnector(sparkContext), rwf: RowWriterFactory[T]): Unit

    Deletes data from a Cassandra table, using data from the stream as lists of primary keys, with the specified column names.

    keyspaceName

    the name of the Keyspace to use

    tableName

    the name of the Table to use

    deleteColumns

    the list of column names to delete; an empty ColumnSelector means the full row

    keyColumns

    primary key columns selector (optional); all RDD primary key columns are checked by default

    writeConf

    additional configuration object allowing to set consistency level, batch size, etc.

    Definition Classes
    DStreamFunctions → WritableToCassandra
    See also

    com.datastax.spark.connector.writer.WritableToCassandra
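
    Example (a minimal sketch; the keyspace test, table users, and the UserKey case class are illustrative assumptions, not part of this API):

      import com.datastax.spark.connector._
      import com.datastax.spark.connector.streaming._
      import org.apache.spark.streaming.dstream.DStream

      // assumed to match a table test.users with primary key (user_id)
      case class UserKey(userId: Int)

      def deleteSeenUsers(keys: DStream[UserKey]): Unit = {
        // delete the full rows whose primary key matches each streamed key
        keys.deleteFromCassandra("test", "users")
        // or delete only the email column of the matching rows,
        // leaving the rest of each row intact
        keys.deleteFromCassandra("test", "users", SomeColumns("email"))
      }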

  8. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  9. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  10. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  11. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  12. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  13. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  14. def isTraceEnabled(): Boolean

    Attributes
    protected
    Definition Classes
    Logging
  15. def joinWithCassandraTable[R](keyspaceName: String, tableName: String, selectedColumns: ColumnSelector = AllColumns, joinColumns: ColumnSelector = PartitionKeyColumns)(implicit connector: CassandraConnector = CassandraConnector(sparkContext), newType: ClassTag[R], rrf: RowReaderFactory[R], ev: ValidRDDType[R], currentType: ClassTag[T], rwf: RowWriterFactory[T]): DStream[(T, R)]


    Transforms each produced RDD with com.datastax.spark.connector.RDDFunctions.joinWithCassandraTable.
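
    Example (a minimal sketch; the keyspace test, table customers, and the CustomerId case class are illustrative assumptions, not part of this API):

      import com.datastax.spark.connector._
      import com.datastax.spark.connector.streaming._
      import org.apache.spark.streaming.dstream.DStream

      // assumed to match the partition key of test.customers
      case class CustomerId(customerId: Int)

      // for every micro-batch, joins the streamed ids against test.customers
      // on the partition key, yielding (id, matching row) pairs
      def enrich(ids: DStream[CustomerId]): DStream[(CustomerId, CassandraRow)] =
        ids.joinWithCassandraTable[CassandraRow]("test", "customers")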

  16. def log: Logger

    Attributes
    protected
    Definition Classes
    Logging
  17. def logDebug(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  18. def logDebug(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  19. def logError(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  20. def logError(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  21. def logInfo(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  22. def logInfo(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  23. def logName: String

    Attributes
    protected
    Definition Classes
    Logging
  24. def logTrace(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  25. def logTrace(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  26. def logWarning(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  27. def logWarning(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  28. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  29. final def notify(): Unit

    Definition Classes
    AnyRef
  30. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  31. def repartitionByCassandraReplica(keyspaceName: String, tableName: String, partitionsPerHost: Int = 10, partitionKeyMapper: ColumnSelector = PartitionKeyColumns)(implicit connector: CassandraConnector = CassandraConnector(conf), currentType: ClassTag[T], rwf: RowWriterFactory[T]): DStream[T]


    Transforms each produced RDD with com.datastax.spark.connector.RDDFunctions.repartitionByCassandraReplica.
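
    Example (a minimal sketch; the keyspace test, table words, and the WordCount case class are illustrative assumptions, not part of this API):

      import com.datastax.spark.connector._
      import com.datastax.spark.connector.streaming._
      import org.apache.spark.streaming.dstream.DStream

      case class WordCount(word: String, count: Long)

      // moves each batch's elements to Spark executors co-located with the
      // Cassandra replicas that own the matching partition keys, so the
      // subsequent save writes to local replicas
      def saveLocally(counts: DStream[WordCount]): Unit =
        counts
          .repartitionByCassandraReplica("test", "words", partitionsPerHost = 10)
          .saveToCassandra("test", "words")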

  32. def saveToCassandra(keyspaceName: String, tableName: String, columnNames: ColumnSelector = AllColumns, writeConf: WriteConf = WriteConf.fromSparkConf(conf))(implicit connector: CassandraConnector = CassandraConnector(conf), rwf: RowWriterFactory[T]): Unit

    Performs com.datastax.spark.connector.writer.WritableToCassandra for each produced RDD, using the specified column names and an additional write configuration (batch size, etc.).

    keyspaceName

    the name of the Keyspace to use

    tableName

    the name of the Table to use

    columnNames

    the list of column names to save data to; the names must be unique and must include at least all primary key columns. Non-selected object properties are discarded; non-selected table columns are left unchanged.

    writeConf

    additional configuration object allowing to set consistency level, batch size, etc.

    Definition Classes
    DStreamFunctions → WritableToCassandra
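
    Example (a minimal sketch; the keyspace test, table words, the WordCount case class, and the one-day TTL are illustrative assumptions, not part of this API):

      import com.datastax.spark.connector._
      import com.datastax.spark.connector.streaming._
      import com.datastax.spark.connector.writer.{TTLOption, WriteConf}
      import org.apache.spark.streaming.dstream.DStream

      case class WordCount(word: String, count: Long)

      def persist(counts: DStream[WordCount]): Unit =
        counts.saveToCassandra(
          "test", "words",
          SomeColumns("word", "count"),
          // WriteConf tunes consistency level, batch size, TTL, etc.;
          // the 86400-second TTL here is purely illustrative
          writeConf = WriteConf(ttl = TTLOption.constant(86400))
        )
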
  33. def sparkContext: SparkContext

    Definition Classes
    DStreamFunctions → WritableToCassandra
  34. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  35. def toString(): String

    Definition Classes
    AnyRef → Any
  36. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  37. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  38. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  39. def warnIfKeepAliveIsShort(): Unit


Inherited from Logging

Inherited from Serializable

Inherited from Serializable

Inherited from WritableToCassandra[T]

Inherited from AnyRef

Inherited from Any
