Batch Operations
Overview
Batch operations enable creating multiple records in a single HTTP request within a single database transaction. This provides atomicity (all succeed or all fail), improved performance, and simplified error handling.
Key Characteristics:
- Multiple records in one POST request
- Single ACID transaction
- All-or-nothing semantics (rollback on any failure)
- Shared metadata (counters, timestamps)
- Cascade operations execute for all records
- Single outbox event for entire batch
- Better performance than individual requests
Why Batch Operations?
Performance Benefits
Individual Requests (Inefficient):
POST /api/v4/core/PRD (Product 1) → 200ms
POST /api/v4/core/PRD (Product 2) → 200ms
POST /api/v4/core/PRD (Product 3) → 200ms
Total: 600ms + network overhead
Batch Request (Efficient):
POST /api/v4/core/PRD (Products 1-3) → 250ms
Total: 250ms
Improvements:
- ✅ Fewer database connections
- ✅ Single transaction overhead
- ✅ Reduced network latency
- ✅ Lower server load
Atomicity Guarantee
Problem: Creating related records individually
// ❌ Risk: Order created but items fail
await createOrder(order); // Success
await createOrderItem(item1); // Success
await createOrderItem(item2); // FAILS
// Result: Orphaned order without items!
Solution: Batch operation with transaction
// ✅ All succeed or all fail
await createOrderWithItems([order, item1, item2]);
// Result: Complete order with items or nothing
Endpoint
Method: POST
Path: /api/v4/core/{DIM}
Same endpoint as a single create, but the body is an array of records
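As a sketch, the request construction can be factored into a small helper; the base URL and token below are placeholders, and this is illustrative rather than a required client shape. Only the body (object vs. array) distinguishes a single create from a batch:

```javascript
// Build the request for either a single create or a batch create.
// The endpoint is the same; only the body shape differs (object vs array).
function buildCreateRequest(apiBase, token, dimension, payload) {
  return {
    url: `${apiBase}/api/v4/core/${dimension}`,
    options: {
      method: 'POST',
      headers: {
        'Authorization': `Bearer ${token}`,
        'Content-Type': 'application/json'
      },
      // payload may be one record (object) or many (array)
      body: JSON.stringify(payload)
    }
  };
}

// Single and batch requests target the same path:
const single = buildCreateRequest('https://example.test', 'token', 'PRD',
  { XPRD01: 'Widget A', XPRD02: 29.99, source: 'productCreate' });
const batch = buildCreateRequest('https://example.test', 'token', 'PRD', [
  { XPRD01: 'Widget A', XPRD02: 29.99, source: 'productCreate' },
  { XPRD01: 'Widget B', XPRD02: 39.99, source: 'productCreate' }
]);
console.log(single.url === batch.url); // same endpoint for both
```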
Request Format
Basic Batch Create
Single Record (existing pattern):
{
"XPRD01": "Widget A",
"XPRD02": 29.99,
"source": "productCreate"
}
Multiple Records (batch pattern):
[
{
"XPRD01": "Widget A",
"XPRD02": 29.99,
"source": "productCreate"
},
{
"XPRD01": "Widget B",
"XPRD02": 39.99,
"source": "productCreate"
},
{
"XPRD01": "Widget C",
"XPRD02": 49.99,
"source": "productCreate"
}
]
Critical: the body is an array of objects, each containing the dimension's required fields plus the source parameter
Batch with Shared Metadata
Use Case: Apply shared configuration to all records (e.g., sequential counters)
Format:
[
{
"metadata": true,
"HD_DIM": {
"counter": "XPRD_BATCH_ID"
}
},
{
"XPRD01": "Widget A",
"XPRD02": 29.99,
"source": "productCreate"
},
{
"XPRD01": "Widget B",
"XPRD02": 39.99,
"source": "productCreate"
}
]
First element with "metadata": true applies to all subsequent records:
- Shared counter configuration
- Batch-level settings
- Header-detail relationships
Response Format
Success Response
[
{
"code": 201,
"insertedId": "123",
"body": {
"XPRD01": "Widget A",
"XPRD02": 29.99,
"OWNER": "user123",
"CDATA": "20251219143000",
"TREC": "N"
},
"fields": {
"PRD_ID": {...},
"XPRD01": {...}
}
},
{
"code": 201,
"insertedId": "124",
"body": {
"XPRD01": "Widget B",
"XPRD02": 39.99,
"OWNER": "user123",
"CDATA": "20251219143000",
"TREC": "N"
},
"fields": {
"PRD_ID": {...},
"XPRD01": {...}
}
},
{
"code": 201,
"insertedId": "125",
"body": {
"XPRD01": "Widget C",
"XPRD02": 49.99,
"OWNER": "user123",
"CDATA": "20251219143000",
"TREC": "N"
},
"fields": {
"PRD_ID": {...},
"XPRD01": {...}
}
}
]
The response is an array of results, one per input record, in the same order as the input
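Because the response is positional, results can be zipped back to the inputs. This is a minimal sketch assuming the response shape shown above:

```javascript
// Pair each input record with its result by position.
function mapResultsToInputs(inputs, results) {
  return results.map((result, i) => ({
    input: inputs[i],
    id: result.insertedId,
    created: result.code === 201
  }));
}

const inputs = [
  { XPRD01: 'Widget A' },
  { XPRD01: 'Widget B' }
];
const results = [
  { code: 201, insertedId: '123', body: { XPRD01: 'Widget A' } },
  { code: 201, insertedId: '124', body: { XPRD01: 'Widget B' } }
];
const mapped = mapResultsToInputs(inputs, results);
console.log(mapped.map(m => m.id)); // ['123', '124']
```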
Error Response
Validation Error (400):
{
"code": 400,
"message": "Validation failed for record at index 1",
"errors": [
{
"index": 1,
"field": "XPRD01",
"message": "Field is required"
}
]
}
Transaction Rolled Back: no records are created if any record fails validation
Transaction Flow
Batch Create Transaction
1. Client sends batch request
↓
2. CoreWrite receives array of records
↓
3. Validate all records (required fields, types, COD_ON_OFF)
↓
4. BEGIN TRANSACTION
↓
5. For each record:
- Execute pre-insert functions
- INSERT into TB_ANAG_{DIM}00
- Execute INSERT_CASCADE
- Add to outbox batch
↓
6. Write batch to TB_ANAG_OUTBOX00
↓
7. COMMIT TRANSACTION
↓
8. Return array of insertedIds
If ANY step fails → ROLLBACK entire transaction
ACID Properties:
- Atomicity: All records created or none
- Consistency: Metadata constraints enforced
- Isolation: No partial state visible to other transactions
- Durability: Committed records persisted
JavaScript Implementation
Basic Batch Creator
class BatchCreator {
constructor(apiBase, token) {
this.apiBase = apiBase;
this.token = token;
}
async createBatch(dimension, records) {
// Validate records
if (!Array.isArray(records) || records.length === 0) {
throw new Error('Records must be a non-empty array');
}
// Ensure all records have source parameter
const recordsWithSource = records.map(record => ({
...record,
source: record.source || 'batchCreate'
}));
const response = await fetch(
`${this.apiBase}/api/v4/core/${dimension}`,
{
method: 'POST',
headers: {
'Authorization': `Bearer ${this.token}`,
'Content-Type': 'application/json'
},
body: JSON.stringify(recordsWithSource)
}
);
if (!response.ok) {
const error = await response.json();
throw new Error(error.message || 'Batch create failed');
}
return response.json();
}
async createWithMetadata(dimension, records, metadata) {
const batch = [
{ metadata: true, ...metadata },
...records
];
return this.createBatch(dimension, batch);
}
}
// Usage
const batchCreator = new BatchCreator(apiBase, token);
const products = [
{ XPRD01: 'Widget A', XPRD02: 29.99 },
{ XPRD01: 'Widget B', XPRD02: 39.99 },
{ XPRD01: 'Widget C', XPRD02: 49.99 }
];
const results = await batchCreator.createBatch('PRD', products);
console.log(`Created ${results.length} products`);
Order with Line Items (Header-Detail Pattern)
class OrderBatchCreator {
constructor(apiBase, token) {
this.batchCreator = new BatchCreator(apiBase, token);
}
async createOrderWithItems(order, items) {
// NOTE: this issues two separate batch requests (ORD, then ORDITEM).
// Each batch is atomic on its own, but no transaction spans both:
// if the items batch fails, the order already exists and must be
// cleaned up or retried.
// Step 1: Create order
const orderResults = await this.batchCreator.createBatch('ORD', [order]);
const orderId = orderResults[0].insertedId;
// Step 2: Create items with order reference
const itemsWithOrderId = items.map(item => ({
...item,
XORDITEM08: orderId
}));
const itemResults = await this.batchCreator.createBatch(
'ORDITEM',
itemsWithOrderId
);
return {
order: orderResults[0],
items: itemResults
};
}
async createCompleteOrder(orderData) {
const { customer, shippingAddress, items } = orderData;
try {
// Create order header
const order = {
XORD01: customer.name,
XORD09: customer.id,
XORD10: shippingAddress,
XORD03: items.reduce((sum, item) => sum + item.price * item.qty, 0)
};
// Create order with items
const result = await this.createOrderWithItems(order, items);
return result;
} catch (error) {
console.error('Order creation failed:', error);
throw error;
}
}
}
// Usage
const orderCreator = new OrderBatchCreator(apiBase, token);
const result = await orderCreator.createCompleteOrder({
customer: { id: '456', name: 'ACME Corp' },
shippingAddress: '123 Main St',
items: [
{ XORDITEM01: 'Widget A', price: 29.99, qty: 2 },
{ XORDITEM01: 'Widget B', price: 39.99, qty: 1 }
]
});
console.log(`Order ${result.order.insertedId} created with ${result.items.length} items`);
CSV Import with Batch
class CSVImporter {
constructor(apiBase, token) {
this.batchCreator = new BatchCreator(apiBase, token);
this.batchSize = 100; // Create 100 records at a time
}
parseCsv(csvText) {
// Naive parser: does not handle quoted fields or embedded commas.
// Blank lines (e.g. a trailing newline) are skipped so they don't
// produce empty records.
const lines = csvText.split('\n').filter(line => line.trim() !== '');
const headers = lines[0].split(',').map(h => h.trim());
return lines.slice(1).map(line => {
const values = line.split(',');
const record = {};
headers.forEach((header, index) => {
record[header] = values[index]?.trim();
});
return record;
});
}
async importCsv(dimension, csvText, sourceArea) {
const records = this.parseCsv(csvText);
// Split into batches
const batches = [];
for (let i = 0; i < records.length; i += this.batchSize) {
batches.push(records.slice(i, i + this.batchSize));
}
const results = [];
let successCount = 0;
let failedBatches = [];
// Process batches sequentially
for (let i = 0; i < batches.length; i++) {
try {
const batch = batches[i].map(record => ({
...record,
source: sourceArea
}));
const batchResults = await this.batchCreator.createBatch(
dimension,
batch
);
results.push(...batchResults);
successCount += batchResults.length;
console.log(`Batch ${i + 1}/${batches.length} completed: ${batchResults.length} records`);
} catch (error) {
console.error(`Batch ${i + 1} failed:`, error);
failedBatches.push({ batchIndex: i, error: error.message });
}
}
return {
total: records.length,
success: successCount,
failed: records.length - successCount,
failedBatches,
results
};
}
}
// Usage
const importer = new CSVImporter(apiBase, token);
const csvData = `
XPRD01,XPRD02,XPRD05
Widget A,29.99,electronics
Widget B,39.99,electronics
Widget C,49.99,toys
`.trim();
const result = await importer.importCsv('PRD', csvData, 'productImport');
console.log(`Imported ${result.success}/${result.total} products`);
Best Practices
✅ DO
Use appropriate batch sizes:
// ✅ Good - 50-100 records per batch
const BATCH_SIZE = 100;
// ❌ Bad - too large, may timeout
const BATCH_SIZE = 10000;
Validate before sending:
// ✅ Validate records client-side first
function validateRecord(record) {
if (!record.XPRD01) throw new Error('XPRD01 required');
if (typeof record.XPRD02 !== 'number') throw new Error('XPRD02 must be number');
}
products.forEach(validateRecord);
await batchCreator.createBatch('PRD', products);
Handle partial failures with batching:
// ✅ Process in smaller batches to isolate failures
for (const batch of batches) {
try {
await createBatch(batch);
} catch (error) {
logFailedBatch(batch, error);
// Continue with next batch
}
}
Use a single batch for records that must succeed together:
// ✅ One batch = one transaction
await createBatch('ORDITEM', itemsWithOrderId);
// All items created together or not at all
❌ DON'T
Don't send huge batches:
// ❌ Wrong - may exceed request size limit or timeout
await createBatch('PRD', allProducts); // 10,000 records
// ✅ Correct - split into reasonable batches
for (const batch of chunk(allProducts, 100)) {
await createBatch('PRD', batch);
}
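The `chunk` helper used above is not defined in this document; a minimal implementation might look like:

```javascript
// Split an array into fixed-size chunks (the last chunk may be smaller).
function chunk(items, size) {
  const chunks = [];
  for (let i = 0; i < items.length; i += size) {
    chunks.push(items.slice(i, i + size));
  }
  return chunks;
}

console.log(chunk([1, 2, 3, 4, 5], 2)); // [[1, 2], [3, 4], [5]]
```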
Don't ignore failed batches:
// ❌ Wrong - silent failure
try {
await createBatch('PRD', products);
} catch (error) {
// Ignored
}
// ✅ Correct - log and retry
try {
await createBatch('PRD', products);
} catch (error) {
logError('Batch failed', { products, error });
await retryBatch(products);
}
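The `retryBatch` helper used above is not defined in this document; one plausible shape is retry with exponential backoff. The attempt count and delays below are illustrative, and `createBatch` is passed in and assumed to reject on failure, like the `BatchCreator` method shown earlier:

```javascript
// Retry a failed batch with exponential backoff before giving up.
async function retryBatch(createBatch, records, maxAttempts = 3, baseDelayMs = 500) {
  let lastError;
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      return await createBatch(records);
    } catch (error) {
      lastError = error;
      if (attempt < maxAttempts) {
        // Back off between attempts: 500ms, 1000ms, 2000ms, ...
        await new Promise(resolve => setTimeout(resolve, baseDelayMs * 2 ** (attempt - 1)));
      }
    }
  }
  throw lastError;
}
```

Because the whole batch is rolled back on failure, retrying the same records cannot create duplicates, though validation errors will fail on every attempt and are better not retried.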
Don't mix unrelated records in one batch (a failure in any record rolls back all of them):
// ❌ Wrong - unrelated records in same transaction
await createBatch('PRD', [
{ XPRD01: 'Widget' },
{ XPRD01: 'Unrelated Product' }
]);
// ✅ Correct - batch related records only
await createBatch('PRD', relatedProducts);
Error Handling
Validation Errors
Request:
[
{ "XPRD01": "Widget A", "XPRD02": 29.99, "source": "productCreate" },
{ "XPRD02": 39.99, "source": "productCreate" }, // Missing XPRD01
{ "XPRD01": "Widget C", "XPRD02": "invalid", "source": "productCreate" } // Invalid type
]
Response (400):
{
"code": 400,
"message": "Batch validation failed",
"errors": [
{
"index": 1,
"field": "XPRD01",
"message": "Required field missing"
},
{
"index": 2,
"field": "XPRD02",
"message": "Expected number, got string"
}
]
}
No records are created; the entire transaction is rolled back
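Since each error entry carries the positional `index` of the offending input, a client can report exactly which records were rejected. This sketch assumes the error shape shown above:

```javascript
// Group validation errors by the index of the offending input record.
function groupErrorsByRecord(errorResponse) {
  const byIndex = new Map();
  for (const err of errorResponse.errors || []) {
    if (!byIndex.has(err.index)) byIndex.set(err.index, []);
    byIndex.get(err.index).push(`${err.field}: ${err.message}`);
  }
  return byIndex;
}

const response = {
  code: 400,
  message: 'Batch validation failed',
  errors: [
    { index: 1, field: 'XPRD01', message: 'Required field missing' },
    { index: 2, field: 'XPRD02', message: 'Expected number, got string' }
  ]
};
const grouped = groupErrorsByRecord(response);
console.log(grouped.get(1)); // ['XPRD01: Required field missing']
```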
Permission Errors
Response (403):
{
"code": 403,
"message": "Insufficient permissions for batch create",
"required_grant": 4,
"user_grant": 2
}
Transaction Timeout
Response (500):
{
"code": 500,
"message": "Transaction timeout: Batch too large or slow inserts",
"suggestion": "Reduce batch size and retry"
}
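Following the server's suggestion, a client can halve the batch size and retry. This is a sketch under two assumptions not confirmed by this document: `createBatch` rejects with the timeout message shown above, and detecting a timeout by matching that message text is reliable:

```javascript
// On a timeout, split the batch in half and retry each half recursively.
// isTimeout is a heuristic based on the 500 response message shown above.
async function createWithSplitting(createBatch, records, minSize = 1) {
  try {
    return await createBatch(records);
  } catch (error) {
    const isTimeout = /timeout/i.test(error.message || '');
    if (!isTimeout || records.length <= minSize) throw error;
    const mid = Math.ceil(records.length / 2);
    const first = await createWithSplitting(createBatch, records.slice(0, mid), minSize);
    const second = await createWithSplitting(createBatch, records.slice(mid), minSize);
    return [...first, ...second];
  }
}
```

Note that splitting trades away the all-or-nothing guarantee of the original batch: each sub-batch commits independently.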
Use Case: Bulk Product Import
class ProductBulkImporter {
constructor(apiBase, token) {
this.batchCreator = new BatchCreator(apiBase, token);
this.batchSize = 50;
}
async importProducts(products, onProgress) {
const batches = this.createBatches(products, this.batchSize);
const results = {
total: products.length,
success: 0,
failed: 0,
errors: []
};
for (let i = 0; i < batches.length; i++) {
try {
const batchResults = await this.batchCreator.createBatch(
'PRD',
batches[i]
);
results.success += batchResults.length;
if (onProgress) {
onProgress({
// Cap at total so the last (possibly partial) batch doesn't overshoot
processed: Math.min((i + 1) * this.batchSize, products.length),
total: products.length,
percentage: Math.round(((i + 1) / batches.length) * 100)
});
}
} catch (error) {
results.failed += batches[i].length;
results.errors.push({
batchIndex: i,
records: batches[i],
error: error.message
});
}
}
return results;
}
createBatches(items, size) {
const batches = [];
for (let i = 0; i < items.length; i += size) {
batches.push(items.slice(i, i + size));
}
return batches;
}
}
// Usage with progress tracking
const importer = new ProductBulkImporter(apiBase, token);
const products = [
/* ... 1000 products ... */
];
const results = await importer.importProducts(products, (progress) => {
console.log(`Progress: ${progress.percentage}% (${progress.processed}/${progress.total})`);
updateProgressBar(progress.percentage);
});
console.log(`Import complete: ${results.success} success, ${results.failed} failed`);
if (results.errors.length > 0) {
console.log('Failed batches:', results.errors);
}
Summary
Batch operations provide:
- ✅ Multiple records in single request
- ✅ ACID transaction guarantees
- ✅ All-or-nothing semantics
- ✅ Better performance than individual requests
- ✅ Simplified error handling
- ✅ Shared metadata support
- ✅ Single outbox event for batch
Key Takeaways:
- Send array of records to same POST endpoint
- All records created in single transaction
- Use reasonable batch sizes (50-100 records)
- Validate records client-side before sending
- Handle failed batches gracefully
- Use batches for related records (atomicity)
- Optional metadata element at index [0] for batch configuration
Phase 5.3 Complete! All API Operations documented.
Next: Query Patterns →
Related Concepts
- Write Operations - Single record create pattern
- Entity Lifecycle - TREC states for batch records
- Outbox Pattern - Batch event publishing
- Cascades - Automatic child record creation in batches