
Batch Operations

Overview

Batch operations let you perform many writes efficiently. While the Core APIs don't provide dedicated batch endpoints, you can still batch work effectively using parallel requests or database-level optimizations.

Strategies:

  • Parallel Requests - Multiple API calls concurrently
  • Cascade Operations - Single request creates multiple records
  • Database Batch Inserts - Internal optimization
  • Transaction Batching - Group operations in single transaction

Parallel Requests

Concurrent API Calls

JavaScript Promise.all():

async function batchCreateProducts(products) {
  // Create all products in parallel
  const requests = products.map(product =>
    fetch(`${apiBase}/api/v4/core/PRD`, {
      method: 'POST',
      headers: {
        'Authorization': `Bearer ${token}`,
        'Content-Type': 'application/json'
      },
      body: JSON.stringify({ data: product })
    })
  );

  // Wait for all to complete
  const responses = await Promise.all(requests);

  // Parse results
  const results = await Promise.all(
    responses.map(r => r.json())
  );

  return results;
}

// Usage
const products = [
  { XPRD01: 'Product A', XPRD02: 99.99, XPRD05: 'cat_1' },
  { XPRD01: 'Product B', XPRD02: 89.99, XPRD05: 'cat_1' },
  { XPRD01: 'Product C', XPRD02: 79.99, XPRD05: 'cat_2' }
];

const results = await batchCreateProducts(products);
console.log(`Created ${results.length} products`);

Benefits:

  • ✅ Fast (parallel execution)
  • ✅ Independent transactions (one failure doesn't affect others)
  • ✅ Simple implementation

Drawbacks:

  • ⚠️ Many API calls (network overhead)
  • ⚠️ No atomicity (partial success possible)
  • ⚠️ Rate limiting concerns
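One caveat with plain `Promise.all()`: it rejects as soon as any request rejects (for example, on a network failure), discarding the results of the requests that did succeed. A minimal sketch using `Promise.allSettled()` instead, where `createFn` is a placeholder for the per-item API call rather than a function from these docs:

```javascript
// Sketch: Promise.allSettled() reports every outcome instead of rejecting on
// the first failure, so partial success stays inspectable.
// `createFn` is a placeholder for the per-item API call.
async function batchCreateSettled(items, createFn) {
  const outcomes = await Promise.allSettled(items.map(item => createFn(item)));

  return {
    // Values returned by the calls that resolved
    succeeded: outcomes
      .filter(o => o.status === 'fulfilled')
      .map(o => o.value),
    // Rejection reasons (Errors) from the calls that failed
    failed: outcomes
      .filter(o => o.status === 'rejected')
      .map(o => o.reason)
  };
}
```

`Promise.allSettled()` requires Node 12.9+ or a modern browser; a fuller fetch-based version of this idea appears under "Partial Success Handling" below.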

Rate Limiting

Throttle concurrent requests:

async function batchCreateWithThrottle(products, concurrency = 10) {
  const results = [];

  // Process in batches of N concurrent requests
  for (let i = 0; i < products.length; i += concurrency) {
    const batch = products.slice(i, i + concurrency);

    const batchRequests = batch.map(product =>
      fetch(`${apiBase}/api/v4/core/PRD`, {
        method: 'POST',
        headers: {
          'Authorization': `Bearer ${token}`,
          'Content-Type': 'application/json'
        },
        body: JSON.stringify({ data: product })
      })
    );

    const batchResponses = await Promise.all(batchRequests);
    const batchResults = await Promise.all(
      batchResponses.map(r => r.json())
    );

    results.push(...batchResults);

    console.log(`Processed ${results.length}/${products.length} products`);
  }

  return results;
}

// Usage: 100 products, 10 concurrent requests at a time
const results = await batchCreateWithThrottle(products, 10);
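The chunked loop above waits for an entire batch of N to finish before starting the next, so one slow request stalls the remaining slots in its batch. An alternative sketch (not from these docs) is a sliding-window worker pool that keeps exactly `limit` requests in flight at all times; `taskFn` stands in for the per-item API call:

```javascript
// Sketch: a sliding-window concurrency limiter. Results are returned in the
// same order as the input items. `taskFn` is a placeholder for the call that
// actually hits the API.
async function runWithLimit(items, limit, taskFn) {
  const results = new Array(items.length);
  let next = 0; // index of the next unclaimed item

  // Each worker repeatedly claims the next item until none remain,
  // so a slow item only occupies its own slot
  async function worker() {
    while (next < items.length) {
      const i = next++;
      results[i] = await taskFn(items[i]);
    }
  }

  const workers = Array.from(
    { length: Math.min(limit, items.length) },
    () => worker()
  );
  await Promise.all(workers);

  return results;
}
```

Because JavaScript is single-threaded, `next++` between `await` points is safe without locking; the pool simply refills a slot as soon as its current request resolves.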

Cascade Operations

Single Request, Multiple Records

Use INSERT_CASCADE for master-detail:

async function createOrderWithManyItems(orderData, items) {
  // Single API call creates order + all items
  const response = await fetch(`${apiBase}/api/v4/core/ORD`, {
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${token}`,
      'Content-Type': 'application/json'
    },
    body: JSON.stringify({
      data: {
        ...orderData,
        items: items // Cascade creates all items in single transaction
      }
    })
  });

  return response.json();
}

// Usage: Create order with 100 items
const order = await createOrderWithManyItems(
  {
    XORD_CUSTOMER_ID: 'cust_123',
    XORD03: 9999.00
  },
  Array.from({ length: 100 }, (_, i) => ({
    XORDITEM10: `prd_${i}`,
    XORDITEM_QTY: 1,
    XORDITEM09: 99.99
  }))
);

console.log(`Created order with ${order.data.items.length} items`);

Benefits:

  • ✅ Single API call (minimal network overhead)
  • ✅ Atomic (all-or-nothing)
  • ✅ Efficient database execution

Drawbacks:

  • ⚠️ Limited to configured cascades
  • ⚠️ Request size limits (typically 1-10MB)
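Because the whole cascade travels in one request body, it can be worth checking the serialized payload size before sending. A minimal sketch; the 10 MB ceiling here is an assumption, so check your deployment's actual limit:

```javascript
// Sketch: estimate the serialized size of a cascade payload before sending.
// MAX_BODY_BYTES is an assumed ceiling, not a documented limit.
const MAX_BODY_BYTES = 10 * 1024 * 1024;

function assertBodyWithinLimit(body) {
  // TextEncoder counts UTF-8 bytes, not UTF-16 string length
  const bytes = new TextEncoder().encode(JSON.stringify(body)).length;
  if (bytes > MAX_BODY_BYTES) {
    throw new Error(
      `Request body is ${bytes} bytes, over the ${MAX_BODY_BYTES}-byte limit`
    );
  }
  return bytes;
}
```

If the check fails, split the items across several cascade requests, keeping in mind that each request is then its own transaction.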

Bulk Update

Update Multiple Records

Promise.all() with PATCH:

async function bulkUpdateCategory(productIds, newCategory) {
  const requests = productIds.map(id =>
    fetch(`${apiBase}/api/v4/core/PRD/${id}`, {
      method: 'PATCH',
      headers: {
        'Authorization': `Bearer ${token}`,
        'Content-Type': 'application/json'
      },
      body: JSON.stringify({
        data: { XPRD05: newCategory }
      })
    })
  );

  const responses = await Promise.all(requests);
  const results = await Promise.all(responses.map(r => r.json()));

  return results;
}

// Usage: Update 50 products
const productIds = Array.from({ length: 50 }, (_, i) => 100 + i);
const results = await bulkUpdateCategory(productIds, 'cat_electronics');
console.log(`Updated ${results.length} products`);

Conditional Bulk Update

Update only if condition met:

async function bulkUpdateWithCondition(productIds, updateFn, conditionFn) {
  // 1. Fetch all products
  const products = await Promise.all(
    productIds.map(id =>
      fetch(`${apiBase}/api/v4/core/PRD/${id}`, {
        headers: { 'Authorization': `Bearer ${token}` }
      }).then(r => r.json())
    )
  );

  // 2. Filter by condition
  const toUpdate = products.filter(p => conditionFn(p.data));

  // 3. Update filtered products
  const updates = toUpdate.map(p =>
    fetch(`${apiBase}/api/v4/core/PRD/${p.data.PRD_ID}`, {
      method: 'PATCH',
      headers: {
        'Authorization': `Bearer ${token}`,
        'Content-Type': 'application/json'
      },
      body: JSON.stringify({
        data: updateFn(p.data)
      })
    })
  );

  const results = await Promise.all(updates);
  console.log(`Updated ${results.length}/${products.length} products`);

  return results;
}

// Usage: Increase price by 10% for products below $100
await bulkUpdateWithCondition(
  productIds,
  (product) => ({
    XPRD02: product.XPRD02 * 1.1 // +10%
  }),
  (product) => product.XPRD02 < 100 // Condition
);

Bulk Delete

Soft Delete Multiple Records

async function bulkSoftDelete(dimension, recordIds) {
  const requests = recordIds.map(id =>
    fetch(`${apiBase}/api/v4/core/${dimension}/${id}`, {
      method: 'DELETE',
      headers: { 'Authorization': `Bearer ${token}` }
    })
  );

  const responses = await Promise.all(requests);
  const results = await Promise.all(responses.map(r => r.json()));

  return results;
}

// Usage: Soft delete 20 products
const productIds = [123, 124, 125, /*...*/];
const results = await bulkSoftDelete('PRD', productIds);
console.log(`Soft deleted ${results.length} products`);
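One hedged caveat: depending on the deployment, a DELETE may answer with `204 No Content`, in which case `response.json()` throws on the empty body. A defensive parse helper (this behavior is an assumption, not part of the documented API contract):

```javascript
// Sketch: guard against DELETE responses that carry no JSON body.
// Returning { deleted: true } for 204 is a convention chosen here,
// not something the API above is documented to do.
async function parseDeleteResponse(response) {
  if (response.status === 204) {
    return { deleted: true };
  }
  return response.json();
}
```

With this helper, the bulk delete above would map responses through `parseDeleteResponse` instead of calling `r.json()` directly.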

Error Handling in Batch Operations

Partial Success Handling

async function batchCreateWithErrorHandling(products) {
  const results = {
    succeeded: [],
    failed: []
  };

  const requests = products.map(async (product, index) => {
    try {
      const response = await fetch(`${apiBase}/api/v4/core/PRD`, {
        method: 'POST',
        headers: {
          'Authorization': `Bearer ${token}`,
          'Content-Type': 'application/json'
        },
        body: JSON.stringify({ data: product })
      });

      if (response.ok) {
        const result = await response.json();
        results.succeeded.push({
          index,
          product,
          result: result.data
        });
      } else {
        const error = await response.json();
        results.failed.push({
          index,
          product,
          error: error.message,
          code: error.code
        });
      }
    } catch (error) {
      results.failed.push({
        index,
        product,
        error: error.message,
        code: 'NETWORK_ERROR'
      });
    }
  });

  await Promise.all(requests);

  console.log(`Success: ${results.succeeded.length}, Failed: ${results.failed.length}`);

  return results;
}

// Usage
const results = await batchCreateWithErrorHandling(products);

if (results.failed.length > 0) {
  console.error('Failed products:', results.failed);
  // Optionally retry failed items
  await retryFailed(results.failed);
}

Retry Failed Operations

async function retryFailed(failedItems, maxRetries = 3) {
  const retryResults = [];

  for (const item of failedItems) {
    let attempt = 0;
    let success = false;

    while (attempt < maxRetries && !success) {
      attempt++;
      console.log(`Retry ${attempt}/${maxRetries} for item ${item.index}`);

      try {
        const response = await fetch(`${apiBase}/api/v4/core/PRD`, {
          method: 'POST',
          headers: {
            'Authorization': `Bearer ${token}`,
            'Content-Type': 'application/json'
          },
          body: JSON.stringify({ data: item.product })
        });

        if (response.ok) {
          const result = await response.json();
          retryResults.push({ success: true, item, result });
          success = true;
        } else {
          const error = await response.json();
          if (attempt === maxRetries) {
            retryResults.push({ success: false, item, error });
          }
        }
      } catch (error) {
        if (attempt === maxRetries) {
          retryResults.push({ success: false, item, error: error.message });
        }
      }

      // Exponential backoff before the next attempt
      // (outside the try so network errors also back off)
      if (!success && attempt < maxRetries) {
        await sleep(Math.pow(2, attempt) * 1000); // 2s, then 4s
      }
    }
  }

  return retryResults;
}

function sleep(ms) {
  return new Promise(resolve => setTimeout(resolve, ms));
}

Progress Tracking

Batch with Progress Updates

async function batchWithProgress(products, onProgress) {
  const results = [];
  const total = products.length;

  for (let i = 0; i < total; i++) {
    const product = products[i];

    try {
      const response = await fetch(`${apiBase}/api/v4/core/PRD`, {
        method: 'POST',
        headers: {
          'Authorization': `Bearer ${token}`,
          'Content-Type': 'application/json'
        },
        body: JSON.stringify({ data: product })
      });

      const result = await response.json();

      // Treat HTTP errors as failures rather than results
      if (!response.ok) {
        throw new Error(result.message || `HTTP ${response.status}`);
      }

      results.push(result);

      // Update progress
      onProgress({
        current: i + 1,
        total,
        percentage: Math.round(((i + 1) / total) * 100),
        product: result.data
      });
    } catch (error) {
      results.push({ error: error.message });
      onProgress({
        current: i + 1,
        total,
        percentage: Math.round(((i + 1) / total) * 100),
        error: error.message
      });
    }
  }

  return results;
}

// Usage
const results = await batchWithProgress(products, (progress) => {
  console.log(`Progress: ${progress.percentage}% (${progress.current}/${progress.total})`);
  updateProgressBar(progress.percentage);
});

Performance Considerations

Batch Size Recommendations

| Operation         | Batch Size | Concurrency | Duration |
|-------------------|------------|-------------|----------|
| Simple POST       | 100-1000   | 10-20       | ~10s     |
| POST with CASCADE | 10-100     | 5-10        | ~5-10s   |
| PATCH             | 100-500    | 10-20       | ~5-10s   |
| Soft DELETE       | 100-500    | 10-20       | ~5-10s   |

Factors:

  • Network latency
  • Database load
  • Record complexity
  • Cascade operations
  • Validation complexity

Memory Considerations

Avoid loading all results in memory:

// ❌ Bad - loads 10,000 results in memory
async function batchCreateBad(products) {
const results = await Promise.all(
products.map(p => createProduct(p))
);
return results; // 10k objects in memory
}

// ✅ Good - processes in chunks
async function batchCreateGood(products, chunkSize = 100) {
let processedCount = 0;

for (let i = 0; i < products.length; i += chunkSize) {
const chunk = products.slice(i, i + chunkSize);
const results = await Promise.all(
chunk.map(p => createProduct(p))
);

processedCount += results.length;
console.log(`Processed: ${processedCount}/${products.length}`);

// Results not accumulated in memory
}

return { total: processedCount };
}

Best Practices

✅ DO:

Use parallel requests for independent operations:

// ✅ Good - parallel
const results = await Promise.all(
  products.map(p => createProduct(p))
);

Use cascades for related records:

// ✅ Good - single request
await createOrder({
  ...orderData,
  items: manyItems // Cascade
});

Handle partial failures:

// ✅ Good - track success/failure
const { succeeded, failed } = await batchWithErrorHandling(products);

Throttle concurrent requests:

// ✅ Good - 10 concurrent max
await batchWithThrottle(products, 10);

❌ DON'T:

Don't send unbounded parallel requests:

// ❌ Bad - 10,000 concurrent requests
await Promise.all(
  tenThousandProducts.map(p => createProduct(p))
);

Don't ignore errors:

// ❌ Bad - silent failures
await Promise.all(
  products.map(p => createProduct(p).catch(() => {}))
);

Don't batch unrelated operations:

// ❌ Bad - mixing different operations
await Promise.all([
  createProduct(p1),
  updateOrder(o1),
  deleteCustomer(c1)
]);

Summary

  • ✅ Use parallel requests for independent operations
  • ✅ Use CASCADE for related records (single transaction)
  • ✅ Throttle concurrent requests (10-20 max)
  • ✅ Handle partial failures gracefully
  • ✅ Track progress for long-running batches
  • ✅ Retry failed operations with exponential backoff
  • ✅ Process in chunks to avoid memory issues

Key Takeaways:

  1. Parallel requests: Fast but independent transactions
  2. Cascades: Single transaction, atomic operation
  3. Throttle concurrency to avoid overload
  4. Handle partial success/failure
  5. Retry with exponential backoff
  6. Track progress for user feedback
  7. Process large batches in chunks

Batch Strategy Decision:

| Scenario                        | Strategy                         |
|---------------------------------|----------------------------------|
| Create order with 100 items     | CASCADE (single request)         |
| Create 100 independent products | Parallel requests (throttled)    |
| Update 500 products             | Parallel PATCH (chunks of 50)    |
| Soft delete 200 records         | Parallel DELETE (chunks of 50)   |
| Create 10,000 products          | Chunked parallel (100 at a time) |