The Bulk / Batch / Import Operations Pattern is used to process multiple items in a single API request, allowing servers to efficiently create, update, or validate large sets of data.
This pattern helps address performance bottlenecks, rate-limit pressure, and network overhead, and it supports data-synchronization use cases.
This pattern is common when migrating data, performing nightly data syncs, importing from SaaS systems, or managing large product catalogs.
8.1. Overview
Bulk, batch, and asynchronous import operations allow clients to submit many items at once. These patterns help reduce:
- Excessive network chatter (N requests vs 1 request)
- Strain on rate limits
- Client complexity
- Latency for large ingest operations
The pattern comes in three related variants:
- Bulk Import – per-item processing with partial success
- Batch Import – all-or-nothing, transactional import
- Async Bulk Import – queuing and deferred processing of large datasets
The server processes many items efficiently and returns results that reflect the processing semantics chosen.
Characteristics:
- Request includes multiple items
- Response reflects per-item outcomes
- Server may enforce size limits or streaming formats
- Supports partial success (bulk) or full rollbacks (batch)
- Can optionally offload processing to async jobs
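The size-limit characteristic above can be enforced before any per-item work begins. The following sketch illustrates one way to do it; `MAX_ITEMS`, the `413` status choice, and the error body shape are illustrative assumptions, not part of any spec in this chapter:

```python
# Illustrative guard for bulk payload size; MAX_ITEMS is an assumed policy value.
MAX_ITEMS = 1000

def check_bulk_size(items):
    """Reject oversized bulk payloads before any per-item processing starts."""
    if len(items) > MAX_ITEMS:
        # 413 Payload Too Large is one reasonable response for oversized bulk requests
        return 413, {"error": "TOO_MANY_ITEMS", "max": MAX_ITEMS}
    return None  # payload size is acceptable; continue processing
```

Rejecting oversized payloads up front keeps a single client from exhausting server memory or monopolizing the request pipeline.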
8.2. When to Use This Pattern
Use Bulk/Batch/Import operations when:
- You need to ingest many records at once (customer imports, product catalogs).
- Network round-trips for individual items would be too slow or expensive.
- Data synchronization occurs in batches, not single items.
- Partial success is acceptable (bulk variant).
- Full transactional integrity is required (batch variant).
- The dataset is too large for synchronous processing (async variant).
Choosing the Right Import Strategy
| Variant | Description | Semantics | When to Use |
|---|---|---|---|
| Bulk Import | Each item processed independently | Partial success allowed | Importing data where individual failures are expected / OK |
| Batch Import | All items must succeed; otherwise none are applied | All-or-nothing | When consistency/integrity matters more than throughput |
| Async Bulk Import | Import is queued and processed later | Deferred, often bulk | When payload is large or processing is slow/expensive |
8.3. When NOT to Use This Pattern
Do not use bulk/batch/import when:
- You only ever operate on single items (use CRUD).
- You have a workflow with states (use Extended CRUD).
- You are performing a command or computation (use Functional Resource).
- The server cannot safely process or validate large inputs.
- Real-time, low-latency updates are required (bulk jobs may take time).
Avoid using Bulk Import to hide:
- Poorly designed workflows
- Massive unbounded data ingestion
- Instantly-required user-facing actions
8.4. What the Pattern Looks Like
Bulk/batch/import operations appear in these common forms:
Bulk Import (Partial Success)

`POST /customers/bulk-import`

- Processes each item independently
- Returns per-item success/failure results
- Often uses `207 Multi-Status` to indicate mixed outcomes

Batch Import (All-or-Nothing)

`POST /customers/batch-import`

- Validates the entire payload before writing
- If any item fails, none are persisted
- Returns `200` on full success; `400` (or another 4xx) on batch failure, with per-item errors

Async Bulk Import (Queued)

`POST /customers/import`

- Accepts a large payload or file
- Server queues the operation and returns `202 Accepted` with a `Location` header
- Client polls `GET /imports/{importId}` to check status and see per-item results
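On the client side, the mixed outcomes of a bulk response need to be split apart so failed records can be fixed and resubmitted. A minimal sketch, assuming the response body follows the `BulkCustomersResults` shape used in this chapter (the function name itself is illustrative):

```python
def partition_bulk_results(response_body):
    """Split a 207 Multi-Status bulk response into succeeded and failed items.

    response_body is assumed to follow the BulkCustomersResults schema:
    {"results": [{"id": ..., "status": "SUCCESS" | "FAILED", ...}]}
    """
    succeeded = [r for r in response_body["results"] if r["status"] == "SUCCESS"]
    failed = [r for r in response_body["results"] if r["status"] == "FAILED"]
    return succeeded, failed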
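On the client side, the mixed outcomes of a bulk response need to be split apart so failed records can be fixed and resubmitted. A minimal sketch, assuming the response body follows the `BulkCustomersResults` shape used in this chapter (the function name itself is illustrative):

```python
def partition_bulk_results(response_body):
    """Split a 207 Multi-Status bulk response into succeeded and failed items.

    response_body is assumed to follow the BulkCustomersResults schema:
    {"results": [{"id": ..., "status": "SUCCESS" | "FAILED", ...}]}
    """
    succeeded = [r for r in response_body["results"] if r["status"] == "SUCCESS"]
    failed = [r for r in response_body["results"] if r["status"] == "FAILED"]
    return succeeded, failed
```

A client would typically log the failed entries, correct them, and resubmit only those records.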
8.5. Anti-Patterns to Avoid
- Conflating bulk vs batch semantics: clients must know whether partial success is possible.
- Returning only a single "success/failure" flag for an entire bulk job: clients need per-item results to fix bad records.
- Accepting huge JSON arrays with no limits: risks timeouts and memory pressure; set maximum sizes and consider streaming or async processing.
- Making all heavy bulk operations synchronous: large imports should usually be async.
- Using bulk import to simulate a workflow: workflow state transitions belong in Extended CRUD, not bulk.
8.6. OpenAPI Examples
8.6.1. Bulk Import (Per-Item Processing)
openapi: 3.0.3
info:
title: Customers Bulk Import API - Bulk / Batch Pattern Example
version: 1.0.0
servers:
- url: https://api.example.com
paths:
/customers/bulk-import:
post:
summary: Bulk import or update customers (per-item processing)
tags: [Customers]
requestBody:
required: true
content:
application/json:
schema:
$ref: '#/components/schemas/BulkCustomersRequest'
examples:
default:
value:
items:
- id: "cus_123"
email: "alice@example.com"
name: "Alice Doe"
- id: "cus_456"
email: "bob_at_example.com"
name: "Bob Smith"
responses:
'207':
description: Multi-Status response with per-item results
content:
application/json:
schema:
$ref: '#/components/schemas/BulkCustomersResults'
examples:
default:
value:
results:
- id: "cus_123"
status: SUCCESS
- id: "cus_456"
status: FAILED
errorCode: "EMAIL_INVALID"
errorMessage: "Email address is not valid."
'400':
description: Invalid request
components:
schemas:
BulkCustomersRequest:
type: object
properties:
items:
type: array
items:
$ref: '#/components/schemas/CustomerBulkItem'
required:
- items
CustomerBulkItem:
type: object
properties:
id:
type: string
example: "cus_123"
email:
type: string
example: "alice@example.com"
name:
type: string
example: "Alice Doe"
required:
- email
- name
BulkCustomersResults:
type: object
properties:
results:
type: array
items:
$ref: '#/components/schemas/CustomerBulkResult'
required:
- results
CustomerBulkResult:
type: object
properties:
id:
type: string
example: "cus_456"
status:
type: string
enum: [SUCCESS, FAILED]
example: "FAILED"
errorCode:
type: string
nullable: true
example: "EMAIL_INVALID"
errorMessage:
type: string
nullable: true
example: "Email address is not valid."
required:
- id
- status
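A server handler for this endpoint could follow the per-item semantics of the spec above. This is a simplified sketch: the email check stands in for real validation, and persistence is omitted:

```python
def bulk_import_customers(items):
    """Process each item independently; one bad record never aborts the rest."""
    results = []
    for item in items:
        email = item.get("email", "")
        if "@" not in email:
            # Record the failure and keep going with the remaining items
            results.append({
                "id": item.get("id"),
                "status": "FAILED",
                "errorCode": "EMAIL_INVALID",
                "errorMessage": "Email address is not valid.",
            })
            continue
        # Persist the customer here (e.g. upsert by id); omitted in this sketch.
        results.append({"id": item.get("id"), "status": "SUCCESS"})
    # A real handler would return this body with HTTP 207 Multi-Status.
    return {"results": results}
```

Note that processing continues past the invalid record, which is exactly what distinguishes the bulk variant from the batch variant below.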
8.6.2. Batch Import (All-or-Nothing)
openapi: 3.0.3
info:
title: Customers Batch Import API - Batch Variant Example
version: 1.0.0
servers:
- url: https://api.example.com
paths:
/customers/batch-import:
post:
summary: Batch import customers (all-or-nothing)
tags: [Customers]
requestBody:
required: true
content:
application/json:
schema:
$ref: '#/components/schemas/BatchCustomersRequest'
examples:
default:
value:
items:
- id: "cus_123"
email: "alice@example.com"
name: "Alice Doe"
- id: "cus_456"
email: "bob_at_example.com"
name: "Bob Smith"
responses:
'200':
description: All customers imported successfully
content:
application/json:
schema:
$ref: '#/components/schemas/BatchCustomersResults'
examples:
default:
value:
items:
- id: "cus_123"
status: 200
errors: []
- id: "cus_456"
status: 200
errors: []
'400':
description: Batch failed; no customers imported
content:
application/json:
schema:
$ref: '#/components/schemas/BatchCustomersResults'
examples:
default:
value:
items:
- id: "cus_123"
status: 200
errors: []
- id: "cus_456"
status: 400
errors:
- code: "EMAIL_INVALID"
message: "Email address is not valid."
components:
schemas:
BatchCustomersRequest:
type: object
properties:
items:
type: array
items:
$ref: '#/components/schemas/CustomerBatchItem'
required:
- items
CustomerBatchItem:
type: object
properties:
id:
type: string
example: "cus_123"
email:
type: string
example: "alice@example.com"
name:
type: string
example: "Alice Doe"
required:
- email
- name
BatchCustomersResults:
type: object
properties:
items:
type: array
items:
$ref: '#/components/schemas/CustomerBatchResult'
required:
- items
CustomerBatchResult:
type: object
properties:
id:
type: string
example: "cus_456"
status:
type: integer
example: 400
errors:
type: array
items:
$ref: '#/components/schemas/CustomerBatchError'
required:
- id
- status
- errors
CustomerBatchError:
type: object
properties:
code:
type: string
example: "EMAIL_INVALID"
message:
type: string
example: "Email address is not valid."
required:
- code
- message
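The all-or-nothing semantics of this endpoint can be sketched as a two-phase handler: validate everything first, then write only if every item passed. The validation rule and transaction step are simplified assumptions:

```python
def batch_import_customers(items):
    """All-or-nothing import: validate every item first, write only if all pass."""
    report = []
    for item in items:
        errors = []
        if "@" not in item.get("email", ""):
            errors.append({"code": "EMAIL_INVALID",
                           "message": "Email address is not valid."})
        report.append({"id": item.get("id"),
                       "status": 400 if errors else 200,
                       "errors": errors})
    if any(entry["errors"] for entry in report):
        # At least one item failed validation: nothing is persisted.
        return 400, {"items": report}
    # Open a transaction, write every item, commit; omitted in this sketch.
    return 200, {"items": report}
```

Validating before writing avoids the harder problem of rolling back partially applied writes, though a real implementation still needs a transaction around the write phase to guard against mid-write failures.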
8.6.3. Async Bulk Import (Queued Processing)
openapi: 3.0.3
info:
title: Customers Async Import API - Bulk / Batch Pattern Example
version: 1.0.0
servers:
- url: https://api.example.com
paths:
/customers/import:
post:
summary: Submit customer import for asynchronous processing
tags: [Customers]
requestBody:
required: true
content:
multipart/form-data:
schema:
type: object
properties:
file:
type: string
format: binary
required:
- file
responses:
'202':
description: Import accepted and queued
headers:
Location:
schema:
type: string
example: https://api.example.com/imports/import_202501
content:
application/json:
schema:
$ref: '#/components/schemas/ImportStatus'
examples:
default:
value:
importId: "import_202501"
status: "QUEUED"
/imports/{importId}:
get:
summary: Get import status and results
tags: [Imports]
parameters:
- in: path
name: importId
required: true
schema:
type: string
example: "import_202501"
responses:
'200':
description: Import status
content:
application/json:
schema:
$ref: '#/components/schemas/ImportStatus'
examples:
queued:
value:
importId: "import_202501"
status: "IN_PROGRESS"
results: []
completed:
value:
importId: "import_202501"
status: "COMPLETED"
results:
- id: "cus_123"
status: "SUCCESS"
- id: "cus_456"
status: "FAILED"
errorCode: "EMAIL_INVALID"
errorMessage: "Email address is not valid."
'404':
description: Import job not found
components:
schemas:
ImportStatus:
type: object
properties:
importId:
type: string
example: "import_202501"
status:
type: string
enum: [QUEUED, IN_PROGRESS, COMPLETED, FAILED]
example: "QUEUED"
results:
type: array
items:
$ref: '#/components/schemas/ImportResultItem'
required:
- importId
- status
ImportResultItem:
type: object
properties:
id:
type: string
example: "cus_456"
status:
type: string
enum: [SUCCESS, FAILED]
example: "FAILED"
errorCode:
type: string
nullable: true
example: "EMAIL_INVALID"
errorMessage:
type: string
nullable: true
example: "Email address is not valid."
required:
- id
- status
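The queued lifecycle behind these two endpoints can be sketched with an in-memory job store. A real service would use a durable queue and database, and the worker would run out of process; here the three functions simply mirror the submit, process, and poll steps:

```python
import uuid

# Illustrative in-memory job store; a real service would use a queue + database.
_jobs = {}

def submit_import(items):
    """Accept an import, queue it, and return the 202-style response body."""
    import_id = f"import_{uuid.uuid4().hex[:8]}"
    _jobs[import_id] = {"importId": import_id, "status": "QUEUED",
                        "items": items, "results": []}
    return {"importId": import_id, "status": "QUEUED"}

def run_import(import_id):
    """Worker step: process the queued items and record per-item results."""
    job = _jobs[import_id]
    job["status"] = "IN_PROGRESS"
    for item in job["items"]:
        ok = "@" in item.get("email", "")  # stand-in for real validation
        result = {"id": item.get("id"), "status": "SUCCESS" if ok else "FAILED"}
        if not ok:
            result["errorCode"] = "EMAIL_INVALID"
        job["results"].append(result)
    job["status"] = "COMPLETED"

def get_import_status(import_id):
    """What GET /imports/{importId} would return for this job."""
    job = _jobs[import_id]
    return {"importId": job["importId"], "status": job["status"],
            "results": job["results"]}
```

The key design point is that submission and processing are decoupled: the client gets its `importId` immediately, and per-item results accumulate on the job record for later polling.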
8.7. Visualizing the Bulk / Batch / Import Pattern (Mermaid)
8.7.1. Bulk Import (Per-Item Processing)
sequenceDiagram
autonumber
participant C as Client
participant API as Customers API
participant DB as Customer Store
Note over C,API: Bulk import (per-item)
C->>API: POST /customers/bulk-import<br/>{ items: [...] }
loop For each item
API->>DB: upsert(customer)
alt valid
DB-->>API: success
else invalid
DB-->>API: validation error
end
end
API-->>C: 207 Multi-Status<br/>{ results: [SUCCESS, FAILED, ...] }
8.7.2. Batch Import (All-or-Nothing)
sequenceDiagram
autonumber
participant C as Client
participant API as Customers API
participant DB as Customer Store
Note over C,API: Batch import (all-or-nothing)
C->>API: POST /customers/batch-import<br/>{ items: [...] }
API->>DB: validate all items (no writes yet)
alt any item invalid
DB-->>API: validation errors
API-->>C: 400 Bad Request<br/>{ items:[{id,status,errors...}] }<br/>// nothing persisted
else all valid
API->>DB: write all items in one transaction
DB-->>API: success
API-->>C: 200 OK<br/>{ items:[{id,status,errors:[]}, ...] }
end
8.7.3. Async Bulk Import (Queued Processing)
sequenceDiagram
autonumber
participant C as Client
participant API as Customers API
participant Queue as Import Queue
participant W as Import Worker
participant DB as Customer Store
Note over C,API: Submit async import
C->>API: POST /customers/import (file)
API->>Queue: enqueue(import job)
API-->>C: 202 Accepted<br/>Location: /imports/import_202501<br/>{ importId, status: QUEUED }
Note over Queue,W: Background processing
Queue->>W: deliver import_202501 job
W->>DB: upsert customers in batches
DB-->>W: per-item results
W-->>API: update import status + per-item results
Note over C,API: Poll for status
C->>API: GET /imports/import_202501
API-->>C: 200 OK<br/>{ status: COMPLETED, results:[...] }
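The polling step in the diagram above is usually implemented with backoff so clients do not hammer the status endpoint. A minimal sketch, where `fetch_status` is any callable wrapping `GET /imports/{importId}` and the delay values are assumptions:

```python
import time

def poll_until_done(fetch_status, initial_delay=1.0, max_delay=30.0, max_attempts=20):
    """Poll an import job until it reaches a terminal state, with exponential backoff.

    fetch_status: callable returning an ImportStatus-shaped dict,
    e.g. a wrapper around GET /imports/{importId}.
    """
    delay = initial_delay
    for _ in range(max_attempts):
        status = fetch_status()
        if status["status"] in ("COMPLETED", "FAILED"):
            return status
        time.sleep(delay)
        delay = min(delay * 2, max_delay)  # back off, capped at max_delay
    raise TimeoutError("Import did not finish within the polling budget")
```

Servers can make this even friendlier by returning a `Retry-After` header, letting the client use the server's suggested delay instead of guessing.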