Internal
Use Db.table to obtain an instance of this class.
Private Readonly #commands
Private Readonly #db
Private Readonly #http
Internal internal
Readonly keyspace
The keyspace where the table resides.
It is up to the user to ensure that this keyspace really exists, and that this table is in it.
Readonly name
The user-provided, case-sensitive name of the table.
This is unique among all tables and collections in its keyspace, but not necessarily unique across the entire database.
It is up to the user to ensure that this table really exists.
Backdoor to the HTTP client for if it's absolutely necessary. Which it almost never (if even ever) is.
Performs one of the six available table-alteration operations:
add (adds columns to the table)
drop (removes columns from the table)
addVectorize (enables auto-embedding-generation on existing vector columns)
dropVectorize (disables vectorize on existing enabled columns)
addReranking (enables reranking on the table)
dropReranking (disables reranking on the table)
See AlterTableOptions as well for more information.
The options for this operation.
The table with the new schema type.
interface User {
id: UUID,
vector: DataAPIVector,
}
const table = db.table<User>('users');
// Add a column to the table
type NewUser = User & { name: string };
const newTable = await table.alter<NewUser>({
operation: {
add: {
columns: { name: 'text' },
},
},
});
// Drop a column from the table (resets it to how it was originally)
const oldTable = await newTable.alter<User>({
operation: {
drop: {
columns: ['name'],
},
},
});
Table
The alter operation returns the table itself, with the new schema type.
It is heavily recommended to store the result of the alter operation in a new variable, as the old table will not have the new schema type.
You should provide the exact new type of the schema, or it'll just default to SomeRow.
Table
AlterTableOptions
Creates a secondary index on the table.
The operation blocks until the index is created and ready to use.
See Table.createVectorIndex for creating vector indexes, and Table.createTextIndex for creating lexical indexes.
text and ascii-based indexes have access to a few additional options:
caseSensitive (default: true)
normalize (default: true)
ascii (default: false)
The name of the index
The column to index
Optional options: TableCreateIndexOptions
Options for this operation
A promise which resolves once the index is created.
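As a rough sketch (the index and column names below are purely illustrative, and the exact option shape is described by TableCreateIndexOptions):
// Create a case-insensitive index on an existing 'name' text column
await table.createIndex('users_name_idx', 'name', {
  options: { caseSensitive: false },
  ifNotExists: true,
});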
Creates a lexical index on an existing text column in the table.
The operation blocks until the index is created and ready to use.
See Table.createIndex for creating non-lexical indexes.
The name of the index
The text column to index
Optional options: TableCreateTextIndexOptions
Options for this operation
A promise which resolves once the index is created.
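A minimal sketch, assuming an existing text column named 'bio' (see TableCreateTextIndexOptions for analyzer-related options):
// Create a lexical index on the 'bio' text column
await table.createTextIndex('users_bio_text_idx', 'bio');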
Creates an index on an existing vector column in the table.
The operation blocks until the index is created and ready to use.
See Table.createIndex for creating non-vector indexes.
The name of the index
The vector column to index
Optional options: TableCreateVectorIndexOptions
Options for this operation
A promise which resolves once the index is created.
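A hedged sketch (the index and column names are illustrative, and the metric option is described by TableCreateVectorIndexOptions):
// Create a vector index using the cosine similarity metric
await table.createVectorIndex('users_vector_idx', 'vector', {
  options: { metric: 'cosine' },
  ifNotExists: true,
});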
Get the table definition, i.e. its columns and primary key definition.
The method issues a request to the Data API each time it is invoked, without caching mechanisms; this ensures up-to-date information for usages such as real-time table validation by the application.
Optional options: WithTimeout<"tableAdminTimeoutMs">
The options for this operation.
The definition of the table.
const definition = await table.definition();
console.log(definition.columns);
Atomically deletes many rows from the table.
See TableFilter and TableDeleteManyOptions as well for more information.
A filter to select the row(s) to delete.
Optional options: GenericDeleteManyOptions
The options for this operation.
A promise which resolves once the operation is completed.
await table.insertMany([
{ pk: 'abc', ck: 1, name: 'John' },
{ pk: 'abc', ck: 2, name: 'Jane' },
]);
await table.deleteMany({ pk: 'abc' });
There are different forms of accepted filters, e.g. a full primary key, or a partial primary key with trailing partitionSort columns not provided.
🚨Important: If an empty filter is passed, all rows in the table will be deleted in a single API call. Proceed with caution.
Deletes a single row from the table.
See TableFilter and TableDeleteOneOptions as well for more information.
A filter to select the row to delete.
Optional options: TableDeleteOneOptions
The options for this operation.
A promise which resolves once the operation is completed.
await table.insertOne({ pk: 'abc', ck: 3 });
await table.deleteOne({ pk: 'abc', ck: 3 });
🚨Important: The filter must contain an exact primary key to delete a row.
Attempting to pass an empty filter, filtering by only part of the primary key, or filtering by a non-primary key column will result in an error.
Optional options: Omit<DropTableOptions, keyof WithKeyspace>
The options for this operation.
A promise which resolves when the table has been dropped.
const table = await db.table('my_table');
await table.drop();
🚨Important: Once the table is dropped, this object is still technically "usable", but any further operations on it will fail at the Data API level; thus, it's the user's responsibility to make sure that the Table object is no longer used.
Use with caution. Wear your safety goggles. Don't say I didn't warn you.
Find rows in the table, optionally matching the provided filter.
See TableFilter, TableFindOptions, and FindCursor as well for more information.
A filter to select the rows to find. If not provided, all rows will be returned.
Optional options: GenericFindOptions
The options for this operation.
A FindCursor which can be iterated over.
const cursor = table.find({ name: 'John Doe' });
const docs = await cursor.toArray();
🚨Important: When projecting, it is heavily recommended to provide an explicit type override representing the projected schema, to prevent any type-mismatches when the schema is strictly provided.
Otherwise, the rows will be typed as the full Schema, which may lead to runtime errors when trying to access properties that are not present in the projected rows.
💡Tip: Use the Pick or Omit utility types to create a type representing the projected schema.
interface User {
id: string,
name: string,
car: { make: string, model: string },
}
const table = db.table<User>('users');
// --- Not providing a type override ---
const cursor = table.find({}, {
projection: { car: 1 },
});
const next = await cursor.next();
console.log(next.car.make); // OK
console.log(next.name); // Uh oh! Runtime error, since tsc doesn't complain
// --- Explicitly providing the projection type ---
const cursor = table.find<Pick<User, 'car'>>({}, {
projection: { car: 1 },
});
const next = await cursor.next();
console.log(next.car.make); // OK
console.log(next.name); // Type error; won't compile
The filter can contain a variety of operators & combinators to select the rows. See TableFilter for much more information.
⚠️Warning: If the filter is empty, all rows in the table will be returned (up to any provided or server limit).
If the table has vector search enabled, you can find the most relevant rows by providing a vector in the sort option.
Vector ANN searches cannot return more than a set number of rows, which, at the time of writing, is 1000 items.
await table.insertMany([
{ name: 'John Doe', vector: [.12, .52, .32] },
{ name: 'Jane Doe', vector: [.32, .52, .12] },
{ name: 'Dane Joe', vector: [.52, .32, .12] },
]);
const cursor = table.find({}, {
sort: { vector: [.12, .52, .32] },
});
// Returns 'John Doe'
console.log(await cursor.next());
The sort option can be used to sort the rows returned by the cursor. See Sort for more information.
If the sort option is not provided, there is no guarantee as to the order of the rows returned.
🚨Important: When providing a non-vector sort, the Data API will return a smaller number of rows (20, at the time of writing), and stop there. The returned rows are the top results across the whole table according to the requested criterion.
await table.insertMany([
{ name: 'John Doe', age: 1, height: 168 },
{ name: 'John Doe', age: 2, height: 42 },
]);
const cursor = table.find({}, {
sort: { age: 1, height: -1 },
});
// Returns 'John Doe' (age 2, height 42), 'John Doe' (age 1, height 168)
console.log(await cursor.toArray());
Other available options include skip, limit, includeSimilarity, and includeSortVector. See TableFindOptions and FindCursor for more information.
If you prefer, you may also set these options using a fluent interface on the FindCursor itself.
// cursor :: FindCursor<string>
const cursor = table.find({})
.sort({ vector: [.12, .52, .32] })
.projection<{ name: string, age: number }>({ name: 1, age: 1 })
.includeSimilarity(true)
.map(doc => `${doc.name} (${doc.age})`);
When not specifying sorting criteria at all (by vector or otherwise), the cursor can scroll through an arbitrary number of rows as the Data API and the client periodically exchange new chunks of rows.
--
It should be noted that the behavior of the cursor in the case rows have been added/removed after the find was started depends on database internals, and it is not guaranteed, nor excluded, that such "real-time" changes in the data would be picked up by the cursor.
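If you prefer not to materialize everything with toArray(), the cursor can also be consumed lazily; a minimal sketch:
// FindCursor is async-iterable, so rows are fetched in chunks as you iterate
for await (const row of table.find({})) {
  console.log(row);
}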
Find a single row in the table, optionally matching the provided filter.
See TableFilter and TableFindOneOptions as well for more information.
A filter to select the row to find. If not provided, an arbitrary row will be returned.
Optional options: GenericFindOneOptions
The options for this operation.
A row matching the criterion, or null if no such row exists.
const doc = await table.findOne({ name: 'John Doe' });
🚨Important: When projecting, it is heavily recommended to provide an explicit type override representing the projected schema, to prevent any type-mismatches when the schema is strictly provided.
Otherwise, the rows will be typed as the full Schema, which may lead to runtime errors when trying to access properties that are not present in the projected rows.
💡Tip: Use the Pick or Omit utility types to create a type representing the projected schema.
interface User {
id: string,
name: string,
car: { make: string, model: string },
}
const table = db.table<User>('users');
// --- Not providing a type override ---
const row = await table.findOne({}, {
projection: { car: 1 },
});
console.log(row.car.make); // OK
console.log(row.name); // Uh oh! Runtime error, since tsc doesn't complain
// --- Explicitly providing the projection type ---
const row = await table.findOne<Pick<User, 'car'>>({}, {
projection: { car: 1 },
});
console.log(row.car.make); // OK
console.log(row.name); // Type error; won't compile
The filter can contain a variety of operators & combinators to select the row. See TableFilter for much more information.
⚠️Warning: If the filter is empty, and no Sort is present, it's undefined as to which row is selected.
If the table has vector search enabled, you can find the most relevant row by providing a vector in the sort option.
await table.insertMany([
{ name: 'John Doe', vector: [.12, .52, .32] },
{ name: 'Jane Doe', vector: [.32, .52, .12] },
{ name: 'Dane Joe', vector: [.52, .32, .12] },
]);
const doc = await table.findOne({}, {
sort: { vector: [.12, .52, .32] },
});
// 'John Doe'
console.log(doc.name);
The sort option can be used to pick the most relevant row. See Sort for more information.
If the sort option is not provided, there is no guarantee as to which of the rows matching the filter is returned.
await table.insertMany([
{ name: 'John Doe', age: 1, height: 168 },
{ name: 'John Doe', age: 2, height: 42 },
]);
const doc = await table.findOne({}, {
sort: { age: 1, height: -1 },
});
// 'John Doe' (age 2, height 42)
console.log(doc.name);
Other available options include includeSimilarity. See TableFindOneOptions for more information.
If you want to get skip or includeSortVector as well, use Table.find with a limit: 1 instead.
const doc = await table.findOne({}, {
sort: { vector: [.12, .52, .32] },
includeSimilarity: true,
});
Upserts many rows into the table.
See TableInsertManyOptions and TableInsertManyResult as well for more information.
The rows to insert.
Optional options: GenericInsertManyOptions
The options for this operation.
The primary keys of the inserted rows (and the count)
import { uuid } from '@datastax/astra-db-ts';
await table.insertMany([
{ id: uuid.v4(), name: 'John Doe' }, // or UUID.v4()
{ id: uuid.v7(), name: 'Jane Doe' },
]);
🚨Important: This function inserts rows in chunks to avoid exceeding insertion limits, which means it may make multiple requests to the server. As a result, this operation is not necessarily atomic.
If the dataset is large or the operation is ordered, it may take a relatively significant amount of time. During this time, rows inserted by other concurrent processes may be written to the database, potentially causing duplicate id conflicts. In such cases, it's not guaranteed which write will succeed.
By default, it inserts rows in chunks of 50 at a time. You can fine-tune the parameter through the chunkSize option. Note that increasing chunk size won't always increase performance. Instead, increasing concurrency may help.
You can set the concurrency option to control how many network requests are made in parallel on unordered insertions. Defaults to 8.
const rows = Array.from({ length: 100 }, (_, i) => ({ id: i }));
await table.insertMany(rows, { concurrency: 16 });
🚨Important: When inserting a row with a primary key that already exists, the new row will be merged with the existing row, with the new values taking precedence.
If you want to delete old values, you must explicitly set them to null (not undefined).
// Since insertion is ordered, the last unique value for each
// primary key will be the one that remains in the table.
await table.insertMany([
{ id: '123', col1: 'I exist' },
{ id: '123', col1: `I'm new` },
{ id: '123', col2: 'me2' },
], { ordered: true });
await table.findOne({ id: '123' }); // { id: '123', col1: 'I'm new', col2: 'me2' }
// Since insertion is unordered, it is not entirely guaranteed
// which value will remain in the table for each primary key,
// as concurrent insertions may occur.
await table.insertMany([
{ id: '123', col1: null },
{ id: '123', col1: 'hi' },
]);
// col1 may technically be either 'hi' or null
await table.findOne({ id: '123' }); // { id: '123', col1: ? }
You may set the ordered option to true to stop the operation after the first error; otherwise rows may be parallelized and processed in arbitrary order, improving, perhaps vastly, performance.
Setting the ordered option disables any parallelization, so insertions truly are stopped after the very first error.
Setting ordered also guarantees the order of the aforementioned upsert behavior.
The type of the primary key of the table is inferred from the second PKey type-param of the table.
If not present, it defaults to Partial<RSchema> to keep the result type consistent.
interface User {
id: string,
name: string,
dob?: DataAPIDate,
}
type UserPKey = Pick<User, 'id'>;
const table = db.table<User, UserPKey>('table');
// res.insertedIds is of type { id: string }[]
const res = await table.insertMany([
  { id: '123', name: 'Sunrise' },
  { id: '456', name: 'Miso soup' },
]);
console.log(res.insertedIds[0].id); // '123'
TableInsertManyError
If some rows can't be inserted (e.g. they have the wrong data type for a column or lack the primary key), the Data API validation check will fail for those entire specific requests containing the faulty rows.
Depending on concurrency & the ordered parameter, some rows may still have been inserted.
In such cases, the operation will throw a TableInsertManyError containing the partial result.
If a thrown exception is not due to an insertion error, e.g. a 5xx error or network error, the operation will throw the underlying error.
In case of an unordered request, if the error was a simple insertion error, the TableInsertManyError will be thrown after every row has been attempted to be inserted. If it was a 5xx or similar, the error will be thrown immediately.
TableInsertManyError - If the operation fails.
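A hedged sketch of handling this error (the exact shape of the partial result is documented on TableInsertManyError; the rows variable is the one from the earlier example):
import { TableInsertManyError } from '@datastax/astra-db-ts';

try {
  await table.insertMany(rows, { ordered: false });
} catch (e) {
  if (e instanceof TableInsertManyError) {
    // Some rows may still have been inserted; inspect the error for the partial result
    console.error('insertMany partially failed:', e.message);
  } else {
    throw e; // e.g. a 5xx or network error is rethrown as-is
  }
}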
Atomically upserts a single row into the table.
See TableInsertOneOptions and TableInsertOneResult as well for more information.
The row to insert.
Optional options: GenericInsertOneOptions
The options for this operation.
The primary key of the inserted row.
import { UUID, vector, ... } from '@datastax/astra-db-ts';
// Insert a row with a specific ID
await table.insertOne({ id: 'text-id', name: 'John Doe' });
await table.insertOne({ id: UUID.v7(), name: 'Dane Joe' }); // or uuid.v7()
// Insert a row with a vector
// DataAPIVector class enables faster ser/des
const vec = vector([.12, .52, .32]); // or new DataAPIVector([.12, .52, .32])
await table.insertOne({ id: 1, name: 'Jane Doe', vector: vec });
// or if vectorize (auto-embedding-generation) is enabled for the column
await table.insertOne({ id: 1, name: 'Jane Doe', vector: "Hey there!" });
When inserting a row with a primary key that already exists, the new row will be merged with the existing row, with the new values taking precedence.
If you want to delete old values, you must explicitly set them to null (not undefined).
await table.insertOne({ id: '123', col1: 'I exist' });
await table.findOne({ id: '123' }); // { id: '123', col1: 'I exist' }
await table.insertOne({ id: '123', col1: 'I am new' });
await table.findOne({ id: '123' }); // { id: '123', col1: 'I am new' }
await table.insertOne({ id: '123', col2: 'me2' });
await table.findOne({ id: '123' }); // { id: '123', col1: 'I am new', col2: 'me2' }
await table.insertOne({ id: '123', col1: null });
await table.findOne({ id: '123' }); // { id: '123', col2: 'me2' }
The type of the primary key of the table is inferred from the second PKey type-param of the table.
If not present, it defaults to Partial<RSchema> to keep the result type consistent.
interface User {
id: string,
name: string,
dob?: DataAPIDate,
}
type UserPKey = Pick<User, 'id'>;
const table = db.table<User, UserPKey>('table');
// res.insertedId is of type { id: string }
const res = await table.insertOne({ id: '123', name: 'Alice' });
console.log(res.insertedId.id); // '123'
Unsubscribe from an event.
The event to unsubscribe from.
The listener to remove.
Subscribe to an event.
The event to listen for.
The callback to invoke when the event is emitted.
A function to unsubscribe the listener.
Subscribe to an event once.
The listener will be automatically unsubscribed after the first time it is called.
Note that the listener will be unsubscribed BEFORE the actual listener callback is invoked.
The event to listen for.
The callback to invoke when the event is emitted.
A function to prematurely unsubscribe the listener.
Remove all listeners for an event.
If no event is provided, all listeners for all events will be removed.
Optional eventName: E
The event to remove listeners for.
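A minimal sketch of the subscription API (the 'commandStarted' event name and its commandName field are assumed from the client's standard command events):
const unsubscribe = table.on('commandStarted', (event) => {
  console.log(`Command started: ${event.commandName}`);
});

// ... later, stop listening
unsubscribe();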
Updates a single row in the table. Under certain conditions, it may insert or delete a row as well.
See TableFilter, TableUpdateFilter, and TableUpdateOneOptions as well for more information.
A filter to select the row to update.
The update to apply to the selected row.
Optional options: TableUpdateOneOptions
The options for this operation.
A promise which resolves once the operation is completed.
await table.insertOne({ key: '123', name: 'Jerry' });
await table.updateOne({ key: '123' }, { $set: { name: 'Geraldine' } });
🚨Important: The filter must contain an exact primary key to update a row.
Attempting to pass an empty filter, filtering by only part of the primary key, or filtering by a non-primary key column will result in an error.
If the row doesn't exist, and you're $set-ing at least one column to a non-null value, an upsert will occur.
// No upsert will occur here since only nulls are being set
// (this is equivalent to `{ $unset: { name: '' } }`)
await table.updateOne({ key: '123' }, { $set: { name: null } });
// An upsert will occur here since at least one non-null value is being set
await table.updateOne({ key: '123' }, { $set: { name: 'Eleanor', age: null } });
Updates may perform either $set or $unset operations on the row.
✏️Note: $set-ing a column to null is equivalent to $unset-ing it.
If a row was only ever upserted, and all of its non-primary fields are later set to null (or unset), the row will be deleted.
However, if the row was explicitly inserted at any point—even if it was originally upserted—it will not be deleted in this way.
// Upserts row { key: '123', name: 'Michael', age: 3 } into the table
await table.updateOne({ key: '123' }, { $set: { name: 'Michael', age: 3 } });
// Sets row to { key: '123', name: 'Michael', age: null }
// (Would be the same with $unset)
await table.updateOne({ key: '123' }, { $set: { age: null } });
// Deletes row from the table as all non-primary keys are set to null
// (Would be the same with $unset)
await table.updateOne({ key: '123' }, { $set: { name: null } });
Static schema
Strongly types the creation of a const new CreateTableDefinition schema.
Unlike writing the table definition inline in createTable and using InferTableSchema on the Table itself, this method:
Similar to using const Schema = { ... } as const [satisfies CreateTableDefinition<any>].
The schema to strongly type.
The exact same object passed in. This method simply exists for the strong typing.
// Define the table schema
const UserSchema = Table.schema({
columns: {
name: 'text',
dob: {
type: 'timestamp',
},
friends: {
type: 'set',
valueType: 'text',
},
},
primaryKey: {
partitionBy: ['name', 'height'], // type error: 'height' is not a valid column
partitionSort: { dob: 1 },
},
});
// Type inference is as simple as that
type User = InferTableSchema<typeof UserSchema>;
// And now `User` can be used wherever.
const main = async () => {
const table = await db.createTable('users', { definition: UserSchema });
const found: User | null = await table.findOne({});
};
Overview
Represents the interface to a table in a Data-API-enabled database.
Example
Typing the table
A Table is typed as Table<WSchema, PKey, RSchema>, where:
WSchema is the type of the row as it's written to the table (the "write" schema)
PKey (optional) is the type of the primary key of the table as it's returned
RSchema is the type of the row as it's read from the table (the "read" schema). It defaults to FoundRow<WSchema> (see FoundRow).
Typing the primary key
The primary key of the table should be provided as a second type parameter to Table.
This is a special type that is used to reconstruct the TS type of the primary key in insert operations. It should be an object with the same keys as the primary key columns, and the same types as the schema.
Note that there is no distinction between partition and clustering keys in this type.
Example
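A minimal sketch of typing the primary key (the table and column names here are hypothetical):
interface Song {
  artist: string,   // partition key
  position: number, // clustering key
  title: string,
}

// The primary key type simply lists the primary key columns;
// partition and clustering keys are not distinguished here
type SongPKey = Pick<Song, 'artist' | 'position'>;

const songs = db.table<Song, SongPKey>('songs');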
db.createTable type inference
Example
Datatypes
Certain datatypes may be represented as TypeScript classes (some native, some provided by the client).
For example:
'map<k, v>' is represented by a native JS Map
'vector' is represented by an astra-db-ts provided DataAPIVector
'date' is represented by an astra-db-ts provided DataAPIDate
You may also provide your own datatypes by providing some custom serialization logic as well (see later section).
Example
The full list of relevant datatypes (for tables) includes: DataAPIBlob, DataAPIDate, DataAPITime, DataAPIVector, DataAPIInet, DataAPIDuration, UUID, Map, Set, and BigNumber.
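A hedged sketch of what these datatypes look like on insertion (the column names and schema are illustrative, and the date() shorthand and BigNumber re-export are assumed to be available alongside the classes listed above):
import { BigNumber, DataAPIVector, date, uuid } from '@datastax/astra-db-ts';

await table.insertOne({
  id: uuid.v4(),                           // uuid column
  dob: date('1990-01-23'),                 // date column (DataAPIDate)
  friends: new Set(['alice', 'bob']),      // set<text> column
  scores: new Map([['math', 95]]),         // map<text, int> column
  balance: new BigNumber('12345.6789'),    // decimal column
  vector: new DataAPIVector([.1, .2, .3]), // vector column
});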
Big numbers disclaimer
When varints or decimals are present in the schema (when you're serializing bigints and BigNumbers), it will automatically enable usage of a bignumber-friendly JSON library which is capable of serializing/deserializing these numbers without loss of precision, but is much slower than the native JSON library (but, realistically, the difference is likely negligible).
Custom datatypes
You can plug in your own custom datatypes, as well as enable many other features by providing some custom serialization/deserialization logic through the serdes option in TableOptions, DbOptions, and/or DataAPIClientOptions.dbOptions.
Note however that this is currently not entirely stable, and should be used with caution.
🚨Disclaimers
It is on the user to ensure that the TS type of the Table corresponds with the actual CQL table schema, in its TS-deserialized form. Incorrect or dynamic typing could lead to surprising behaviors and easily-preventable errors.
See Db.createTable, Db.table, and InferTableSchema for much more information about typing.
See