# Batch Processing
Batch processing sends an array of records through a flow with controlled concurrency, per-record status tracking, and partial failure handling.
## Batch trigger

Start a batch job via the CLI, API, or MCP. Records are processed through an existing flow.
### Via MCP

```json
{
  "name": "start_batch",
  "arguments": {
    "flow_id": "...",
    "records": [
      {"order_id": "ORD-001", "total": 100.00},
      {"order_id": "ORD-002", "total": 200.00},
      {"order_id": "ORD-003", "total": 300.00}
    ],
    "concurrency": 10
  }
}
```

| Parameter | Type | Default | Description |
|---|---|---|---|
| flow_id | uuid | — | Flow to process records through |
| records | array | — | Array of record objects |
| concurrency | integer (1–50) | 5 | Maximum number of records processed concurrently |
## Per-record processing

Each record in the batch is processed independently through the flow. Records track individual status:
| Status | Description |
|---|---|
| pending | Not yet processed |
| processing | Currently being processed |
| succeeded | Processed successfully |
| failed | Processing failed |
| retried_success | Failed initially, succeeded on retry |
| retried_failed | Failed on retry as well |
| skipped | Skipped due to the max_failures limit |
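As an illustration, the lifecycle implied by this table can be sketched as a transition map. This is a reading of the statuses above, not a fyrn data structure:

```python
# Illustrative sketch of the per-record status lifecycle described above.
# The transition map is an assumption drawn from the table, not part of fyrn.
ALLOWED_TRANSITIONS = {
    "pending": {"processing", "skipped"},      # skipped when max_failures trips
    "processing": {"succeeded", "failed"},
    "failed": {"retried_success", "retried_failed"},  # via a retry pass
    # terminal states
    "succeeded": set(),
    "retried_success": set(),
    "retried_failed": set(),
    "skipped": set(),
}

def advance(status: str, new_status: str) -> str:
    """Move a record to new_status, rejecting impossible transitions."""
    if new_status not in ALLOWED_TRANSITIONS[status]:
        raise ValueError(f"illegal transition {status} -> {new_status}")
    return new_status
```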
## Partial failure handling

By default, batch processing uses `continue_on_error: true` — when a single record fails, the remaining records continue processing. Failed records are tracked individually with their error messages and original payloads, so you can inspect and retry them later without reprocessing the entire batch.
Configure failure behavior in the batch job request:
```yaml
batch:
  continue_on_error: true   # default — keep processing after failures
  on_record_failure: log    # log | dead-letter | callback
  max_failures: 100         # stop the batch after this many failures
```

| Field | Type | Default | Description |
|---|---|---|---|
| continue_on_error | boolean | true | When false, the batch stops on the first failure |
| on_record_failure | string | log | Action per failed record: log records the error, dead-letter sends the record to a dead-letter queue, callback posts the failure to a webhook |
| max_failures | integer | unlimited | Hard cap on failures — once reached, remaining records are marked skipped and the batch job status becomes failed |
When max_failures is reached, records already in-flight finish processing, but no new records are started. This prevents runaway failures from consuming resources when something is systematically wrong (e.g., a target API is down).
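The stop conditions above can be expressed as a small driver loop. The sketch below is sequential and purely illustrative, not fyrn's engine (which runs records concurrently and lets in-flight records finish); `process_fn` stands in for running one record through the flow:

```python
def run_batch(records, process_fn, continue_on_error=True, max_failures=None):
    """Sequential sketch of the documented failure semantics:
    - continue_on_error=False stops at the first failure
    - once max_failures is hit, remaining records are marked 'skipped'
    """
    statuses, failures = [], 0
    for record in records:
        if max_failures is not None and failures >= max_failures:
            statuses.append("skipped")
            continue
        try:
            process_fn(record)
            statuses.append("succeeded")
        except Exception:
            failures += 1
            statuses.append("failed")
            if not continue_on_error:
                # stop on the first failure; untouched records stay 'pending'
                statuses.extend("pending" for _ in records[len(statuses):])
                break
    if max_failures is not None and failures >= max_failures:
        job_status = "failed"
    elif failures:
        job_status = "completed_with_errors"
    else:
        job_status = "completed"
    return statuses, job_status
```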
## Batch job statuses

| Status | Description |
|---|---|
| pending | Job created, not started |
| processing | Records being processed |
| completed | All records succeeded |
| completed_with_errors | Some records failed |
| failed | Job-level failure |
| cancelled | Job was cancelled |
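For illustration, the job-level status can be derived from per-record counts roughly as follows. The mapping is an assumption based on the two tables above, not fyrn's documented logic:

```python
def derive_job_status(counts: dict) -> str:
    """Map per-record counts (status -> number of records) to a job status.
    Illustrative sketch; cancellation is a job-level action not covered here."""
    if counts.get("pending", 0) == sum(counts.values()):
        return "pending"                      # nothing started yet
    if counts.get("processing", 0) or counts.get("pending", 0):
        return "processing"                   # work still in flight
    if counts.get("skipped", 0):
        return "failed"                       # max_failures tripped
    failed = counts.get("failed", 0) + counts.get("retried_failed", 0)
    return "completed_with_errors" if failed else "completed"
```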
## Monitoring batch jobs

### Check status

```json
{ "name": "get_batch_status", "arguments": { "batch_job_id": "..." } }
```

Returns: total records, processed, succeeded, failed, and pending counts.
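A caller-side polling loop might look like the following sketch. `fetch_status` stands in for whatever MCP or HTTP client you use, and the response is assumed to be a dict with a `status` key:

```python
import time

def wait_for_batch(fetch_status, batch_job_id, interval=5.0, timeout=600.0,
                   sleep=time.sleep):
    """Poll until the job leaves 'pending'/'processing' or the timeout expires.
    fetch_status(batch_job_id) is assumed to return the get_batch_status dict."""
    deadline = time.monotonic() + timeout
    while True:
        job = fetch_status(batch_job_id)
        if job["status"] not in ("pending", "processing"):
            return job
        if time.monotonic() >= deadline:
            raise TimeoutError(f"batch {batch_job_id} still running after {timeout}s")
        sleep(interval)
```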
### List recent jobs

```json
{ "name": "list_batch_jobs", "arguments": { "flow_id": "..." } }
```

### Export errors

```json
{ "name": "export_batch_errors", "arguments": { "batch_job_id": "..." } }
```

Returns failed records with their error messages and original payloads.
### Retry failed records

```json
{ "name": "retry_batch_failed_records", "arguments": { "batch_job_id": "..." } }
```

Re-processes failed records at lower concurrency.
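The retry pass can be sketched like this. The halving factor and the `retried_*` outcome mapping are assumptions for illustration, and `process_fn` again stands in for running one record through the flow:

```python
from concurrent.futures import ThreadPoolExecutor

def retry_failed(failed_records, process_fn, original_concurrency=10):
    """Re-run previously failed records at reduced concurrency.
    Outcomes use the retried_* statuses from the per-record table;
    the halving factor is an assumption, not documented behavior."""
    concurrency = max(1, original_concurrency // 2)

    def attempt(record):
        try:
            process_fn(record)
            return record, "retried_success"
        except Exception:
            return record, "retried_failed"

    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        return list(pool.map(attempt, failed_records))  # preserves input order
```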
## Example: Import customer records

Say you have a CSV export of 1,000 customer records to import into your CRM. Define a flow that maps CSV fields to the CRM’s customer endpoint, then start a batch job with all 1,000 records. fyrn processes them at the configured concurrency (e.g., 10 at a time), tracks each record individually, and retries transient failures with exponential backoff. When the batch completes, check for any remaining failures with `fyrn logs search` and retry them — records that failed due to temporary issues (rate limits, timeouts) often succeed on the second pass.
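Turning the CSV export into the records array for start_batch might look like the following sketch (column names are hypothetical; match them to your flow's mapping):

```python
import csv
import io

def csv_to_records(csv_text: str) -> list[dict]:
    """Parse a CSV export into the list of record dicts start_batch expects.
    Field cleanup (trim, lowercase, defaults) happens in the flow mapping
    during processing, not here."""
    return list(csv.DictReader(io.StringIO(csv_text)))

sample = "name,email,phone\nAda Lovelace,ADA@example.com,\n"
records = csv_to_records(sample)
# Each row becomes one record object in the batch request's "records" array.
```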
```yaml
flow: customer-import
version: 1
source:
  connector: import-service
  trigger: manual
target:
  connector: crm-system
  endpoint: /api/customers
  method: POST
mapping:
  name: source.name | trim | required
  email: source.email | lowercase | required
  phone: source.phone | default("")
  company: source.company | omit_if_null
on_error:
  retry: 2x exponential(30s)
  then: dead-letter
```

```shell
# Start the batch via MCP or API
# Monitor progress
fyrn logs search <flow-id> --status failed --since 1h
```

## Published API batch mode
When you publish a flow as an API endpoint, you can enable batch mode so callers can submit arrays of records in a single HTTP request. Add the batch fields to your published API configuration:
```yaml
published_api:
  path: /api/v1/customers/import
  method: POST
  batchEnabled: true
  batchArrayPath: records    # JSON path to the array in the request body
  batchMaxSize: 5000         # max records per request (default: 1000)
  batchConcurrency: 10       # concurrent record processing (default: 5)
```

| Field | Type | Default | Description |
|---|---|---|---|
| batchEnabled | boolean | false | Enable batch mode on this endpoint |
| batchArrayPath | string | — | JSON path to the array of records in the request body (e.g., records, data.items) |
| batchMaxSize | integer | 1000 | Maximum number of records accepted per request; requests exceeding this return 400 |
| batchConcurrency | integer | 5 | How many records to process in parallel (1–50) |
Callers POST a JSON body with the array at batchArrayPath. The endpoint returns a batch_job_id immediately (HTTP 202 Accepted) and processes records asynchronously. Callers poll the batch status endpoint or configure a webhook callback to receive completion notifications.
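Server-side, the documented request handling amounts to: resolve batchArrayPath, enforce batchMaxSize, then answer 202 with a job id or 400. A sketch, with an assumed response shape:

```python
def accept_batch_request(body: dict, config: dict):
    """Sketch of the documented batch-mode request handling; the response
    bodies and the job id are assumptions for illustration."""
    # Walk the dotted batchArrayPath, e.g. "records" or "data.items"
    node = body
    for key in config["batchArrayPath"].split("."):
        node = node.get(key, {}) if isinstance(node, dict) else {}
    if not isinstance(node, list):
        return 400, {"error": "no record array at batchArrayPath"}
    if len(node) > config.get("batchMaxSize", 1000):
        return 400, {"error": "too many records"}
    # Records are processed asynchronously after the 202 is returned
    return 202, {"batch_job_id": "batch-123", "accepted": len(node)}
```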
See the Published APIs guide for the full configuration details.
## What’s next

- Scheduled Flows — Cron and poll triggers
- Testing Flows — Test before processing real data
- MCP Tools — All batch-related MCP tools