Serves the same purpose as BufferedIterator in Scala, but its takeWhile method, unlike the standard one, does not consume the first element that fails the predicate.
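The behaviour described above can be sketched with a minimal look-ahead iterator. This is an illustrative stand-in, not the connector's actual implementation; the names `LookAheadIterator` and `takeWhileSafe` are assumptions:

```scala
// A minimal sketch of a look-ahead iterator whose takeWhile-style method
// does not consume the first non-matching element (illustrative only).
class LookAheadIterator[T](underlying: Iterator[T]) extends Iterator[T] {
  // Always holds the next element, fetched eagerly.
  private var lookAhead: Option[T] = fetch()
  private def fetch(): Option[T] =
    if (underlying.hasNext) Some(underlying.next()) else None

  def hasNext: Boolean = lookAhead.isDefined
  def next(): T = {
    val result = lookAhead.getOrElse(throw new NoSuchElementException)
    lookAhead = fetch()
    result
  }
  // The buffered head, available without consuming it.
  def headOption: Option[T] = lookAhead

  // Consumes only the matching prefix; the first failing element
  // remains available as the head of this iterator.
  def takeWhileSafe(p: T => Boolean): Seq[T] = {
    val buf = scala.collection.mutable.ArrayBuffer.empty[T]
    while (lookAhead.exists(p)) buf += next()
    buf.toSeq
  }
}
```

After `takeWhileSafe(_ < 3)` on `Iterator(1, 2, 3, 4)`, the element 3 is still available from the iterator, which is exactly the property the standard `Iterator.takeWhile` does not guarantee.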
Counts elements fetched from the underlying iterator. A limit, if set, causes the iterator to terminate early.
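A plausible shape for such an iterator is sketched below. This is an assumption about the design, not the connector's actual class:

```scala
// Illustrative sketch: counts elements as they are fetched and
// optionally stops after `limit` elements.
class CountingIterator[T](underlying: Iterator[T], limit: Option[Int] = None)
    extends Iterator[T] {

  private var _count = 0

  /** Number of elements fetched so far. */
  def count: Int = _count

  def hasNext: Boolean =
    underlying.hasNext && limit.forall(_count < _)

  def next(): T = {
    if (!hasNext) throw new NoSuchElementException
    _count += 1
    underlying.next()
  }
}
```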
Class representing a config parameter no longer in use and its replacement, if any. Use the rationale to give the end user more information about the deprecation and what to do instead.
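A plausible shape for this class is sketched below; the field names and the example parameter names are illustrative assumptions, not the connector's actual definitions:

```scala
// Illustrative sketch of a deprecated-parameter descriptor.
case class DeprecatedConfigParameter(
    name: String,
    replacement: Option[String],
    rationale: String) {

  // Builds a user-facing message explaining the deprecation.
  def deprecationMessage: String = replacement match {
    case Some(r) => s"$name is deprecated ($rationale); use $r instead."
    case None    => s"$name is deprecated and has no replacement ($rationale)."
  }
}
```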
Utility trait for classes that want to log data. Creates an SLF4J logger for the class and allows logging messages at different levels, using methods that evaluate their parameters lazily, only if the log level is enabled.
This is a copy of what Spark previously exposed in org.apache.spark.Logging. That class is now private, so similar functionality is exposed here.
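The lazy-evaluation idea can be shown in isolation with a by-name parameter. The real trait delegates to an SLF4J Logger; this standalone sketch substitutes a boolean flag and println so the mechanism is visible without the SLF4J dependency:

```scala
// Illustrative sketch: `msg` is a by-name parameter, so the message
// expression is evaluated only when the level is enabled.
trait LazyLogging {
  protected def debugEnabled: Boolean

  protected def logDebug(msg: => String): Unit =
    if (debugEnabled) println(s"DEBUG: $msg")
}
```

Because `msg` is passed by name, an expensive message expression (string interpolation, serialization of large objects) costs nothing when the level is disabled.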
A HashMap and a PriorityQueue hybrid. Works like a HashMap, but offers additional O(1) access to the entry with the highest value. As in a standard HashMap, entries can be looked up by key in O(1) time. Adding, removing and updating items by key is handled in O(log n) time.
Keys must not be changed externally and must implement proper equals and hashCode. It is advised to use immutable classes for keys.
Values must be properly comparable.
Values may be mutated externally, as long as put is called immediately after each mutation to notify the PriorityHashMap that the value associated with the given key has changed. It is not allowed to mutate more than one value at a time, nor to mutate a value associated with multiple keys. Therefore, it is advised to use immutable classes for values and to update values only by calls to put.
Contrary to the standard Java HashMap implementation, PriorityHashMap does not allocate memory when adding, removing or updating items; it stores all data in flat, non-resizable arrays instead. Therefore its capacity cannot be changed after construction. It is technically possible to remove this limitation in the future.
PriorityHashMap is mutable and not thread-safe.
Internally, PriorityHashMap is composed of the following data arrays:
- an array storing references to keys, forming a heap-based priority queue;
- an array storing the corresponding references to values, always in the same order as the keys;
- an array storing indexes into the first two arrays, used as an inline hash-table allowing keys in the heap to be located in constant time;
- an array for quickly translating heap indexes into hash-table indexes, so that after moving a key/value pair in the heap, the corresponding hash-table entry can be updated without rehashing.
The indexes hash-table doesn't use overflow lists for dealing with hash collisions. Overflow entries are placed in the main hash-table array, in the first free slot to the right of the original position indicated by the key hash. On lookup, if the key is not found at the position indicated by its hash, the search continues to the right until either the key or an empty slot is found.
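The open-addressing scheme described above can be sketched as a small linear-probing table. This is a simplified illustration of the probing strategy only (fixed capacity, no removal, Int payloads standing in for heap indexes), not the connector's actual data layout:

```scala
// Illustrative sketch of linear probing: on a collision the entry goes
// into the next free slot to the right; lookup scans right until the
// key or an empty slot is found. Capacity must exceed the entry count.
class LinearProbingIndex(capacity: Int) {
  private val keys   = new Array[AnyRef](capacity)
  private val values = new Array[Int](capacity)

  private def slot(key: AnyRef): Int =
    (key.hashCode & 0x7fffffff) % capacity

  def put(key: AnyRef, value: Int): Unit = {
    var i = slot(key)
    // Probe right (wrapping around) until an empty slot or the same key.
    while (keys(i) != null && keys(i) != key) i = (i + 1) % capacity
    keys(i) = key
    values(i) = value
  }

  def get(key: AnyRef): Option[Int] = {
    var i = slot(key)
    while (keys(i) != null) {
      if (keys(i) == key) return Some(values(i))
      i = (i + 1) % capacity
    }
    None // hit an empty slot: the key is not present
  }
}
```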
type of keys
type of values; values must be comparable
An iterator that groups items having the same value of the given function (key). To be included in the same group, items with the same key must be adjacent in the original collection.
SpanningIterator buffers internally one group at a time; the wrapped iterator is consumed lazily.
Example:
  val collection = Seq(1 -> "a", 1 -> "b", 1 -> "c", 2 -> "d", 2 -> "e")
  val iterator = new SpanningIterator(collection.iterator, (x: (Int, String)) => x._1)
  val result = iterator.toSeq  // Seq(1 -> Seq("a", "b", "c"), 2 -> Seq("d", "e"))
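A self-contained sketch of the spanning behaviour follows. It is a simplified variant, not the connector's implementation: it keeps the whole original items in each group (rather than projecting out the values) and uses the standard library's buffered iterator for one-element look-ahead:

```scala
// Illustrative sketch: lazily groups adjacent items sharing the same key.
def span[T, K](it: Iterator[T], key: T => K): Iterator[(K, Seq[T])] =
  new Iterator[(K, Seq[T])] {
    private val buffered = it.buffered

    def hasNext: Boolean = buffered.hasNext

    def next(): (K, Seq[T]) = {
      // Peek at the head without consuming it to determine the group key.
      val k = key(buffered.head)
      val group = scala.collection.mutable.ArrayBuffer.empty[T]
      // Consume items only while they share the current key.
      while (buffered.hasNext && key(buffered.head) == k)
        group += buffered.next()
      (k, group.toSeq)
    }
  }
```

Only one group is materialized at a time; elements of later groups are not touched until the corresponding `next()` call.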
Helper class that throws exceptions if there are settings in the spark.cassandra namespace which don't map to known Spark Cassandra Connector properties.
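The validation idea can be sketched as follows. The real class raises an exception (and may behave differently in detail); this sketch merely collects the offending keys, and the property names used in it are illustrative:

```scala
// Illustrative sketch: find properties in the spark.cassandra namespace
// that are not among the known connector parameters.
def unknownProperties(conf: Map[String, String], known: Set[String]): Seq[String] =
  conf.keys
    .filter(_.startsWith("spark.cassandra."))   // only this namespace is checked
    .filterNot(known)                           // drop recognized parameters
    .toSeq
    .sorted
```

A caller would typically throw if the returned sequence is non-empty, listing the unrecognized keys in the error message.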
A helper class making it possible to access components written in Scala from Java code. INTERNAL API
Useful stuff that didn't fit elsewhere.