Node Stream API
Stream manipulation utilities for size limiting, buffering, and merging.
What It Does
The Node Stream API provides specialized utilities for Node.js stream processing, including size-limited uploads, stream buffering to strings, and stream merging for data pipelines.
Key Capabilities
- Size Limiting: Enforce upload size limits with automatic stream destruction
- Stream Buffering: Convert readable streams to complete strings
- Stream Merging: Combine multiple streams sequentially
- Flexible Size Parsing: Support human-readable sizes (100mb, 2gb)
- Error Integration: E11Error format for size limit violations
Main Utilities
limitSize
limitSize(sizeLimit: string): Transform

Creates a Transform stream that enforces file size limits:
- Monitors cumulative chunk sizes
- Destroys stream with SizeLimitError if exceeded
- Supports formats: “10mb”, “2gb”, etc.
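The behavior described above can be sketched with a Transform stream that tracks cumulative bytes. The `parseSize` helper and the exact error message below are hypothetical, assumed only for illustration; the real implementation raises a `SizeLimitError` in the library's E11Error format.

```typescript
import { Transform } from "node:stream";

// Hypothetical size parser: converts "100mb" / "2gb" into a byte count.
const UNITS: Record<string, number> = { b: 1, kb: 1024, mb: 1024 ** 2, gb: 1024 ** 3 };

function parseSize(size: string): number {
  const match = /^(\d+(?:\.\d+)?)(b|kb|mb|gb)$/i.exec(size.trim());
  if (!match) throw new Error(`Unparseable size: ${size}`);
  return Math.floor(Number(match[1]) * UNITS[match[2].toLowerCase()]);
}

function limitSize(sizeLimit: string): Transform {
  const limit = parseSize(sizeLimit);
  let total = 0;
  return new Transform({
    transform(chunk: Buffer, _encoding, callback) {
      total += chunk.length;
      if (total > limit) {
        // Passing an error to the callback destroys the stream,
        // which tears down the whole pipeline it is piped into.
        callback(new Error(`SIZE_LIMIT_REACHED: ${sizeLimit}`));
        return;
      }
      callback(null, chunk); // under the limit: pass the chunk through
    },
  });
}
```

Because the limit is checked per cumulative total rather than per chunk, an upload fails as soon as the running sum crosses the threshold, without buffering the whole payload first.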
bufferStream
async bufferStream(stream: Readable): Promise<string>

Accumulates the stream into a complete UTF-8 string:
- Handles streaming errors
- Returns complete data as Promise
- Memory-efficient for moderate-sized data
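A minimal sketch of this pattern, using Node's async iteration over readable streams (stream errors reject the `for await` loop, so they surface as a rejected Promise):

```typescript
import { Readable } from "node:stream";

// Collect every chunk, then decode the whole buffer as UTF-8 once at the end.
async function bufferStream(stream: Readable): Promise<string> {
  const chunks: Buffer[] = [];
  for await (const chunk of stream) {
    // Chunks may be Buffers (binary streams) or strings (object-mode streams).
    chunks.push(Buffer.isBuffer(chunk) ? chunk : Buffer.from(chunk));
  }
  return Buffer.concat(chunks).toString("utf8");
}
```

Decoding once after `Buffer.concat`, rather than per chunk, avoids splitting multi-byte UTF-8 characters across chunk boundaries.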
mergeStreams
mergeStreams(streams: Readable[]): Readable

Combines multiple streams sequentially:
- Uses async generators
- Sequential data yielding
- Single merged output stream
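The async-generator approach can be sketched in a few lines: drain each source fully before moving to the next, and wrap the generator back into a `Readable`.

```typescript
import { Readable } from "node:stream";

function mergeStreams(streams: Readable[]): Readable {
  async function* merged() {
    for (const stream of streams) {
      // Readable implements AsyncIterable, so yield* forwards its chunks
      // in order; the next stream is not touched until this one ends.
      yield* stream;
    }
  }
  return Readable.from(merged());
}
```

Sequential draining preserves chunk order across sources, which matters when the streams are segments of one logical document (e.g. header, body, footer).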
SizeLimitError
Custom error class with E11Error integration:
- Type: SIZE_LIMIT_REACHED
- Includes size limit in error data
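The shape of this error class can be sketched as follows. The E11Error base class is not shown here; this sketch only assumes its general contract of a `type` code plus structured `data`, so the field names below are illustrative, not the library's actual API.

```typescript
// Assumed E11Error-like shape: a stable `type` code plus structured `data`
// so callers can branch on the error programmatically.
class SizeLimitError extends Error {
  readonly type = "SIZE_LIMIT_REACHED";

  constructor(readonly data: { sizeLimit: string }) {
    super(`Size limit reached: ${data.sizeLimit}`);
    this.name = "SizeLimitError";
  }
}
```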
Usage Example
// Limit upload size
const limitedStream = uploadStream.pipe(limitSize('100mb'))
// Buffer stream to string
const content = await bufferStream(fileStream)
// Merge multiple data sources
const combined = mergeStreams([stream1, stream2, stream3])

Common Use Cases
- File upload size validation
- Stream consumption to strings
- Multi-source data combination
- Memory-safe stream processing
- Upload protection
What Customers Don’t Have to Build
- Size limit enforcement
- Stream buffering logic
- Stream merging utilities
- Size parsing (mb, gb, etc.)
- Error handling for streams
- Transform stream creation