enableBigNumbers (Beta, Optional)
By default, large numbers (such as bigint and BigNumber) are disabled during serialization and deserialization.
This means that attempts to serialize such numbers will result in errors, and they may lose precision during deserialization.
To enable big numbers, you may configure this option to select which numerical type each field is deserialized to.
This errorful behavior exists for two primary reasons, one being the ambiguity inherent in deserialization: a value such as 9007199254740992 is equally representable as a number, a bigint, a BigNumber, or even a string.
Luckily, there is no such ambiguity in serialization, as any number is just a series of digits in JSON.
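As a concrete illustration (plain JavaScript, independent of astra-db-ts), the same digits can be materialized as several different types, and a neighboring value shows how the default number type can silently lose precision:
const digits = '9007199254740993'; // Number.MAX_SAFE_INTEGER + 2

console.log(JSON.parse(digits)); // 9007199254740992 (silently rounded when parsed as a number)
console.log(BigInt(digits));     // 9007199254740993n (exact as a bigint)
console.log(digits);             // '9007199254740993' (exact as a string)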
Deserialization behavior must be configured to enable big numbers on a collection-by-collection basis.
Serialization behavior requires no such configuration since, as mentioned above, there is no serialization ambiguity.
This option can be configured in two ways: as a mapping of field paths to coercion types (as in the example below), or as a function which returns the coercion type to use for a given field.
The coercion type itself (a CollNumCoercion) determines which numerical type matching fields are deserialized to.
See CollNumCoercion for the different coercion types, and any additional caveats on a per-type basis.
The following example uses bigint and BigNumber for monetary fields, and number for all other fields.
It's heavily recommended that you read the documentation for CollNumCoercion to understand the implications of each coercion type.
import { BigNumber, UUID, uuid } from '@datastax/astra-db-ts';

interface Order {
  discount: bigint,
  statusCode: number,
  items: {
    productID: UUID,
    quantity: number,
    price: BigNumber,
  }[],
}

const orders = db.collection<Order>('orders', {
  serdes: {
    enableBigNumbers: {
      '*': 'number',
      'discount': 'bigint',
      'items.*.price': 'bignumber',
    },
  },
});

const { insertedId } = await orders.insertOne({
  discount: 123n,
  statusCode: 1,
  items: [
    {
      productID: uuid.v4(),
      quantity: 2,
      price: BigNumber(100),
    },
  ],
});

const order = await orders.findOne({ _id: insertedId });

console.log(order?.discount);       // 123n
console.log(order?.statusCode);     // 1
console.log(order?.items[0].price); // BigNumber(100)
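For contrast, here is a minimal hedged sketch of the default behavior described above: on a collection with no enableBigNumbers configuration, serializing a bigint is expected to error (the exact error thrown by the client isn't specified here).
// Hypothetical contrast: the same Order type, but default serdes (big numbers disabled).
const plainOrders = db.collection<Order>('orders');

try {
  await plainOrders.insertOne({ discount: 123n, statusCode: 1, items: [] });
} catch (e) {
  // Big numbers are disabled by default, so serializing 123n should be rejected.
  console.error('bigint rejected under default serdes:', e);
}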
mutateInPlace (Optional)
Enables an optimization which allows inserted rows/documents to be mutated in-place when serializing.
The feature itself is stable; however, the exact state of a document after it has been serialized is not guaranteed.
Note that filters and update filters will be mutated as well.
For example, when you insert a record like so:
import { uuid } from '@datastax/astra-db-ts';
await collection.insertOne({ name: 'Alice', friends: { john: uuid('...') } });
The document is internally serialized as such:
{ name: 'Alice', friends: { john: { $uuid: '...' } } }
To avoid mutating a user-provided object, the client will be forced to clone any objects that contain a custom datatype, as well as their parents (which looks something like this):
{ ...original, friends: { ...original.friends, john: { $uuid: '...' } } }
This can be a minor performance hit, especially for large objects. If you're confident that you won't need the object after it's inserted, you can enable this option to skip the cloning and instead mutate the object in place.
// Before
const collection = db.collection<User>('users');

const doc = { name: 'Alice', friends: { john: uuid.v4() } };
await collection.insertOne(doc);

console.log(doc); // { name: 'Alice', friends: { john: UUID<4>('...') } }

// After
const collection = db.collection<User>('users', {
  serdes: { mutateInPlace: true },
});

const doc = { name: 'Alice', friends: { john: uuid.v4() } };
await collection.insertOne(doc);

console.log(doc); // { name: 'Alice', friends: { john: { $uuid: '...' } } }
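Since filters are mutated as well (as noted above), here is a hedged sketch of what that can look like with mutateInPlace enabled:
// Hypothetical sketch: the filter object itself may be rewritten into its wire format.
const johnsId = uuid.v4();
const filter = { 'friends.john': johnsId };

await collection.findOne(filter);
console.log(filter); // may now be { 'friends.john': { $uuid: '...' } }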
Default: false
codecs (Alpha, Optional)
Provides a structured interface for integrating custom serialization/deserialization logic for documents/rows, filters, ids, etc.
You may create implementations of these codecs through the TableCodecs and CollectionCodecs classes.
See TableSerDesConfig.codecs & CollectionSerDesConfig.codecs for much more information.
Disclaimer
Codecs are a powerful feature, but they should be used with caution, as it's possible to break the client's behavior by using them incorrectly.
Always test your codecs against a variety of documents to ensure they behave as expected before using them on real data.