Gitpulse
Merged · Size: L (Large: 500–1000 weighted lines) · Change breakdown: Feature 70%, Bug Fix 30%
#3137 feat(server): Gracefully handle oversized batch items instead of aborting the stream

Oversized batch items isolated to prevent stream failures

ericallam · Feb 27, 2026 · #3137

Large payloads no longer abort entire batch processes. Individual oversized items are now marked as failed while the rest of the batch stream continues normally.

Large batches containing oversized payloads can now be processed without aborting the entire operation. Previously, a single oversized item caused a stream parsing error that halted the batch and triggered exponential-backoff retries.

Items that exceed the maximum size limit are now identified and processed as individually failed runs. The remainder of an oversized line split across multiple chunks is correctly discarded, which prevents parsing errors on subsequent items in the stream.
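The isolation strategy can be sketched as follows. This is an illustrative example, not the actual trigger.dev code: the function name `parseNdjsonBatch`, the marker shape, and the byte limit are all assumptions.

```typescript
// Illustrative sketch: parse NDJSON lines, marking oversized items as
// failed instead of throwing and aborting the whole stream.
type ParsedItem =
  | { ok: true; index: number; value: unknown }
  | { ok: false; index: number; error: "PAYLOAD_TOO_LARGE" };

// Hypothetical limit; the real maximum is configured server-side.
const MAX_ITEM_BYTES = 1_000_000;

function parseNdjsonBatch(
  input: string,
  maxBytes = MAX_ITEM_BYTES,
): ParsedItem[] {
  const items: ParsedItem[] = [];
  const lines = input.split("\n").filter((l) => l.length > 0);
  lines.forEach((line, index) => {
    if (Buffer.byteLength(line, "utf8") > maxBytes) {
      // Emit an error marker; subsequent lines still parse normally.
      items.push({ ok: false, index, error: "PAYLOAD_TOO_LARGE" });
    } else {
      items.push({ ok: true, index, value: JSON.parse(line) });
    }
  });
  return items;
}
```

The key design point is that a size violation produces a per-item result rather than an exception, so one bad item cannot poison the rest of the batch.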

Raw byte streams are scanned to extract necessary routing metadata from oversized items without loading them fully into memory. These oversized items are then gracefully routed to a pre-failed state, and the details page is updated to display the status and ID of these runs clearly.
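One way to recover routing metadata without materializing an oversized item is to inspect only a bounded prefix of its raw bytes. The sketch below assumes the needed field (here called `taskIdentifier`, a hypothetical name) appears near the start of the serialized item as a plain JSON string.

```typescript
// Illustrative sketch: recover a routing field from the first few
// kilobytes of an oversized NDJSON line, without parsing or buffering
// the full payload.
function extractRoutingField(
  prefix: Buffer,
  field: string,
): string | undefined {
  // Only look at a bounded window of the raw bytes.
  const text = prefix.subarray(0, 4096).toString("utf8");
  // Works when the field's value is a plain string without escapes
  // and appears within the inspected window.
  const match = text.match(new RegExp(`"${field}"\\s*:\\s*"([^"\\\\]*)"`));
  return match?.[1];
}
```

This trades generality for memory safety: a field buried past the window, or containing escape sequences, would need a streaming JSON tokenizer instead.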

Original GitHub description:

Gracefully handle oversized batch items instead of aborting the stream.

When an NDJSON batch item exceeds the maximum size, the parser now emits an error marker instead of throwing, allowing the batch to seal normally. The oversized item becomes a pre-failed run with PAYLOAD_TOO_LARGE error code, while other items in the batch process successfully. This prevents batchTriggerAndWait from seeing connection errors and retrying with exponential backoff.

Also fixes the NDJSON parser not consuming the remainder of an oversized line split across multiple chunks, which caused "Invalid JSON" errors on subsequent lines.
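The cross-chunk fix amounts to carrying a "skip until the next newline" flag between chunks. A simplified stateful sketch, not the actual parser (class and field names are illustrative):

```typescript
// Illustrative sketch of the cross-chunk fix: once a line is known to be
// oversized, discard bytes until the next newline, even if that newline
// arrives in a later chunk. Without this state, the tail of the oversized
// line would be parsed as a new line and fail with "Invalid JSON".
class NdjsonLineParser {
  private buffer = "";
  private skippingOversized = false;

  constructor(private maxBytes: number) {}

  feed(chunk: string): Array<unknown | "PAYLOAD_TOO_LARGE"> {
    const results: Array<unknown | "PAYLOAD_TOO_LARGE"> = [];
    this.buffer += chunk;
    let nl: number;
    while ((nl = this.buffer.indexOf("\n")) !== -1) {
      const line = this.buffer.slice(0, nl);
      this.buffer = this.buffer.slice(nl + 1);
      if (this.skippingOversized) {
        // Consume the remainder of the oversized line, then resume.
        this.skippingOversized = false;
        continue;
      }
      results.push(
        line.length > this.maxBytes ? "PAYLOAD_TOO_LARGE" : JSON.parse(line),
      );
    }
    if (this.skippingOversized) {
      // Still inside the oversized line: drop this chunk's bytes.
      this.buffer = "";
    } else if (this.buffer.length > this.maxBytes) {
      // Partial line already too large: emit the marker now and skip
      // everything up to the next newline.
      results.push("PAYLOAD_TOO_LARGE");
      this.skippingOversized = true;
      this.buffer = "";
    }
    return results;
  }
}
```

Note that the marker can be emitted before the line's newline has even arrived, which is what lets later items in the same batch parse cleanly.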

© 2026 · via Gitpulse