A TransformStream that batches AppendRecords based on time, record count, and byte size.
Input: AppendRecord (individual records)
Output: { records: AppendRecord[], fencing_token?: string, match_seq_num?: number }
Example

const batcher = new BatchTransform<"string">({
  lingerDurationMillis: 20,
  maxBatchRecords: 100,
  maxBatchBytes: 256 * 1024,
  match_seq_num: 0, // Optional: auto-increments per batch
});

// Pipe through the batcher and session to get acks
readable.pipeThrough(batcher).pipeThrough(session).pipeTo(writable);

// Or use manually
const writer = batcher.writable.getWriter();
writer.write(AppendRecord.make("foo"));
await writer.close();

for await (const batch of batcher.readable) {
  console.log(`Got batch of ${batch.records.length} records`);
}
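To make the limits and the match_seq_num auto-increment concrete, here is a rough sketch (not taken from the SDK docs) that writes 250 small records through a batcher capped at 100 records per batch and logs each emitted batch. The import path, the record payloads, and the expected split into batches of 100/100/50 with match_seq_num 0, 1, 2 are assumptions for illustration; the linger timer and byte cap can change the actual split.

// Sketch only: the module specifier is an assumption; adjust it to however
// BatchTransform and AppendRecord are actually exported in your project.
import { AppendRecord, BatchTransform } from "@s2-dev/streamstore";

const batcher = new BatchTransform<"string">({
  lingerDurationMillis: 20,
  maxBatchRecords: 100,
  maxBatchBytes: 256 * 1024,
  match_seq_num: 0, // auto-increments per emitted batch
});

// Consume batches concurrently so writes are never stalled by backpressure.
const consumer = (async () => {
  for await (const batch of batcher.readable) {
    console.log(`batch #${batch.match_seq_num}: ${batch.records.length} records`);
  }
})();

// Write 250 records, then close to flush the final, partially filled batch.
const writer = batcher.writable.getWriter();
for (let i = 0; i < 250; i++) {
  await writer.write(AppendRecord.make(`record-${i}`));
}
await writer.close();
await consumer;

Starting the consumer before closing the writer mirrors the piped usage above, where the downstream session drains batches as they are produced.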