Migration Guide

Overview

Step-by-step guide for migrating from legacy systems to Q01 Core APIs. This guide covers assessment, planning, execution strategies, validation, and rollback procedures.

Migration Phases:

  1. Assessment & Planning
  2. Preparation & Setup
  3. Data Migration
  4. Data Validation
  5. Application Migration
  6. Cutover & Rollback

Phase 1: Assessment & Planning

Inventory Legacy System

Identify:

  • Data structures and relationships
  • Business logic and rules
  • Integration points
  • User workflows
  • Performance requirements
  • Security requirements

Document:

# Legacy System Inventory

## Data Structures
- Products table: 50,000 records
- Categories table: 100 records
- Orders table: 500,000 records
- Customers table: 25,000 records

## Relationships
- Product → Category (many-to-one)
- Order → Customer (many-to-one)
- Order → OrderLine → Product (one-to-many)

## Business Logic
- Price calculations (discounts, taxes)
- Inventory management
- Order workflow (pending → confirmed → shipped)

## Integration Points
- Payment gateway (Stripe)
- Shipping provider (UPS API)
- Email service (SendGrid)
- Analytics (Google Analytics)

## Performance
- Current: 100 req/sec peak
- Target: 200 req/sec peak
- Response time: < 200ms p95

## Security
- User authentication (custom)
- Role-based permissions
- SSL/TLS encryption

Map to Core API Dimensions

Mapping table:

Legacy Table   Core Dimension   Notes
products       PRD              Map product fields to XPRD* variables
categories     CAT              Map category fields to XCAT* variables
orders         ORD              Map order fields to XORD* variables
order_lines    ORDLN            Map order line fields to XORDLN* variables
customers      CUST             Map customer fields to XCUST* variables

Field mapping:

// Legacy → Core API field mapping
const fieldMapping = {
  products: {
    id: 'PRD_ID',
    name: 'XPRD01',
    price: 'XPRD02',
    sku: 'XPRD03',
    slug: 'XPRD04',
    category_id: 'XPRD05',
    active: 'XPRD06' // 1/0 → Y/N
  },
  categories: {
    id: 'CAT_ID',
    name: 'XCAT01',
    description: 'XCAT02'
  },
  orders: {
    id: 'ORD_ID',
    customer_id: 'XORD_CUSTOMER_ID',
    order_date: 'XORD_DATE',
    status: 'XORD_STATUS',
    total: 'XORD_TOTAL'
  }
};
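
To apply this mapping generically, a small transformer can walk the mapping object and rename fields. The sketch below assumes the fieldMapping object above is in scope; the boolean-to-Y/N conversion mirrors the note on XPRD06:

// Hypothetical helper: build a Core API record from a legacy row using
// the fieldMapping object above (assumed in scope).
function transformRow(table, legacyRow) {
  const mapping = fieldMapping[table];
  if (!mapping) throw new Error(`No mapping defined for table "${table}"`);

  const record = {};
  for (const [legacyField, coreVar] of Object.entries(mapping)) {
    let value = legacyRow[legacyField];
    // Boolean flags become Y/N variables, per the XPRD06 note above
    if (typeof value === 'boolean') value = value ? 'Y' : 'N';
    record[coreVar] = value;
  }
  return record;
}

On create you would typically omit the primary-key entry (id → PRD_ID) and store the legacy ID in a dedicated variable instead, as the dual-write example in Phase 3 does with XPRD_LEGACY_ID.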

Create Migration Plan

Timeline:

Week 1-2:   Assessment & Planning
Week 3-4:   Setup & Preparation
Week 5-6:   Data Migration (Phase 1 - Reference Data)
Week 7-8:   Data Migration (Phase 2 - Transactional Data)
Week 9-10:  Application Migration (Backend)
Week 11-12: Application Migration (Frontend)
Week 13:    Testing & Validation
Week 14:    Cutover & Monitoring

Risk Assessment:

Risk                      Impact   Probability   Mitigation
Data loss                 High     Low           Dual-write, validation
Downtime                  High     Medium        Blue-green deployment
Performance degradation   Medium   Medium        Load testing, monitoring
Data inconsistency        High     Medium        Reconciliation jobs
Rollback needed           High     Low           Keep legacy system running

Phase 2: Preparation & Setup

Setup Core API Access

# 1. Get API credentials
export CORE_API_URL="https://api.q01.io"
export CORE_API_TOKEN="your-token-here"

# 2. Configure context
export SOURCE="tenant_123"
export PESO="3"
export AMBIENTE="P"
export CENTRO_DETT="HQ"

# 3. Test connectivity
curl -H "Authorization: Bearer $CORE_API_TOKEN" \
     -H "X-Source: $SOURCE" \
     -H "X-Peso: $PESO" \
     "$CORE_API_URL/api/v4/core/products?limit=1"
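
The JavaScript examples in the rest of this guide refer to a coreAPI client. A minimal axios setup consistent with the environment variables above might look like the sketch below; the X-Ambiente and X-Centro-Dett header names are assumptions, so check your tenant's header conventions:

// core-api-client.js — minimal sketch of the coreAPI client used below
const axios = require('axios');

const coreAPI = axios.create({
  baseURL: process.env.CORE_API_URL,
  timeout: 30000,
  headers: {
    Authorization: `Bearer ${process.env.CORE_API_TOKEN}`,
    'X-Source': process.env.SOURCE,
    'X-Peso': process.env.PESO,
    'X-Ambiente': process.env.AMBIENTE,       // assumed header name
    'X-Centro-Dett': process.env.CENTRO_DETT  // assumed header name
  }
});

module.exports = coreAPI;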

Configure Metadata

Verify dimensions in TB_DIM:

-- Already exists in Q01, verify:
SELECT DIM_ID, DIM_CODE, DIM_NAME
FROM TB_DIM
WHERE DIM_CODE IN ('PRD', 'CAT', 'ORD', 'ORDLN', 'CUST');

Configure variables in TB_VAR:

-- Verify field configurations
SELECT VAR_ID, DIM_ID, VAR, VAR_TYPE, COD_ON_OFF, REQUIRED
FROM TB_VAR
WHERE DIM_ID IN (SELECT DIM_ID FROM TB_DIM WHERE DIM_CODE = 'PRD')
ORDER BY VAR;

Setup Migration Tools

// migration-tool.js
const {Pool} = require('pg');
const axios = require('axios');

class MigrationTool {
  constructor(legacyDB, coreAPI) {
    this.legacyDB = legacyDB;
    this.coreAPI = coreAPI;
    this.stats = {
      total: 0,
      migrated: 0,
      failed: 0,
      errors: []
    };
  }

  async migrate(tableName, dimension, transformer, batchSize = 100) {
    console.log(`Migrating ${tableName} → ${dimension}...`);

    // Get total count
    const {rows: [{count}]} = await this.legacyDB.query(
      `SELECT COUNT(*) as count FROM ${tableName}`
    );
    this.stats.total = parseInt(count, 10);

    // Process in batches
    let offset = 0;
    while (offset < this.stats.total) {
      const batch = await this.fetchBatch(tableName, offset, batchSize);
      const transformed = batch.map(transformer);

      try {
        await this.coreAPI.post(`/api/v4/core/${dimension}/batch`, {
          records: transformed
        });

        this.stats.migrated += batch.length;
        console.log(`Progress: ${this.stats.migrated}/${this.stats.total}`);
      } catch (error) {
        this.stats.failed += batch.length;
        this.stats.errors.push({
          offset,
          count: batch.length,
          error: error.message
        });
      }

      offset += batchSize;
    }

    return this.stats;
  }

  async fetchBatch(tableName, offset, limit) {
    const {rows} = await this.legacyDB.query(
      `SELECT * FROM ${tableName} ORDER BY id LIMIT $1 OFFSET $2`,
      [limit, offset]
    );
    return rows;
  }
}

module.exports = MigrationTool;

Phase 3: Data Migration

Strategy 1: One-Time Bulk Migration

Best for: Reference data, historical data

// migrate-products.js
const MigrationTool = require('./migration-tool');
// legacyDB (pg pool), coreAPI (axios client), and slugify are assumed to be
// wired up elsewhere — see the setup and helper sketches in this guide.

async function migrateProducts() {
  const tool = new MigrationTool(legacyDB, coreAPI);

  const stats = await tool.migrate('products', 'products', (legacy) => ({
    XPRD01: legacy.name,
    XPRD02: legacy.price,
    XPRD03: legacy.sku,
    XPRD04: slugify(legacy.name),
    XPRD05: legacy.category_id,
    XPRD06: legacy.active ? 'Y' : 'N',
    TREC: 'N'
  }), 100);

  console.log('Migration complete:', stats);
}

migrateProducts().catch(console.error);
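
Note that slugify is not defined in this guide; a minimal version would do, or swap in the slugify npm package if you need Unicode or locale handling:

// Minimal slugify sketch used by migrate-products.js
function slugify(text) {
  return String(text)
    .toLowerCase()
    .trim()
    .replace(/[^a-z0-9]+/g, '-') // collapse runs of non-alphanumerics
    .replace(/^-+|-+$/g, '');    // trim leading/trailing hyphens
}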

Strategy 2: Dual-Write Pattern

Best for: Active transactional data

// dual-write-service.js
class DualWriteService {
  constructor(legacyDB, coreAPI) {
    this.legacyDB = legacyDB;
    this.coreAPI = coreAPI;
  }

  async createProduct(data) {
    let legacyId, coreId;

    try {
      // 1. Write to legacy system
      const legacyResult = await this.legacyDB.query(
        `INSERT INTO products (name, price, sku, category_id, active)
         VALUES ($1, $2, $3, $4, $5) RETURNING id`,
        [data.name, data.price, data.sku, data.category_id, data.active]
      );
      legacyId = legacyResult.rows[0].id;

      // 2. Write to Core API
      const coreResult = await this.coreAPI.post('/api/v4/core/products', {
        XPRD01: data.name,
        XPRD02: data.price,
        XPRD03: data.sku,
        XPRD05: data.category_id,
        XPRD06: data.active ? 'Y' : 'N',
        XPRD_LEGACY_ID: legacyId // Store legacy ID for reconciliation
      });
      coreId = coreResult.data.PRD_ID; // axios puts the response body on .data

      // 3. Update legacy with Core ID
      await this.legacyDB.query(
        `UPDATE products SET core_id = $1 WHERE id = $2`,
        [coreId, legacyId]
      );

      return {legacyId, coreId};
    } catch (error) {
      // Rollback on failure
      if (legacyId && !coreId) {
        await this.legacyDB.query(`DELETE FROM products WHERE id = $1`, [legacyId]);
      }
      throw error;
    }
  }

  async updateProduct(id, data) {
    // Look up the Core ID stored at creation — legacy and Core IDs differ
    const {rows: [{core_id}]} = await this.legacyDB.query(
      `SELECT core_id FROM products WHERE id = $1`, [id]
    );

    // Update both systems
    await Promise.all([
      this.legacyDB.query(
        `UPDATE products SET name = $1, price = $2 WHERE id = $3`,
        [data.name, data.price, id]
      ),
      this.coreAPI.patch(`/api/v4/core/products/${core_id}`, {
        XPRD01: data.name,
        XPRD02: data.price
      })
    ]);
  }

  async deleteProduct(id) {
    const {rows: [{core_id}]} = await this.legacyDB.query(
      `SELECT core_id FROM products WHERE id = $1`, [id]
    );

    // Delete from both systems
    await Promise.all([
      this.legacyDB.query(`DELETE FROM products WHERE id = $1`, [id]),
      this.coreAPI.delete(`/api/v4/core/products/${core_id}`)
    ]);
  }
}

module.exports = DualWriteService;
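
During the dual-write window, the service replaces direct database writes in your endpoints. A hypothetical Express wiring (route path and error handling are illustrative, not part of Q01):

// Hypothetical Express route using DualWriteService during the transition.
const express = require('express');
const DualWriteService = require('./dual-write-service');

const app = express();
app.use(express.json());

const dualWrite = new DualWriteService(legacyDB, coreAPI); // wired from your config

app.post('/api/products', async (req, res) => {
  try {
    const {legacyId, coreId} = await dualWrite.createProduct(req.body);
    res.status(201).json({legacyId, coreId});
  } catch (error) {
    // createProduct already rolled back the legacy insert if the Core write failed
    res.status(502).json({error: 'Product creation failed'});
  }
});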

Strategy 3: Incremental Migration

Best for: Large datasets, gradual transition

// incremental-migration.js
// legacyDB, coreAPI, and the transform() field mapper are assumed in scope.
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

class IncrementalMigration {
  async migrateByDateRange(startDate, endDate) {
    console.log(`Migrating records from ${startDate} to ${endDate}...`);

    const records = await legacyDB.query(
      `SELECT * FROM products
       WHERE created_at BETWEEN $1 AND $2
       ORDER BY created_at`,
      [startDate, endDate]
    );

    for (const record of records.rows) {
      try {
        await coreAPI.post('/api/v4/core/products', transform(record));
      } catch (error) {
        console.error(`Failed to migrate product ${record.id}:`, error);
      }
    }
  }

  async migrateInWeeks() {
    const weeks = [
      ['2023-01-01', '2023-01-07'],
      ['2023-01-08', '2023-01-14'],
      ['2023-01-15', '2023-01-21'],
      // ... continue for all weeks
    ];

    for (const [start, end] of weeks) {
      await this.migrateByDateRange(start, end);
      await sleep(5000); // Pause between batches to limit load
    }
  }
}
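
For runs that span days, persisting a high-water mark lets an interrupted job resume instead of restarting from scratch. One possible approach, assuming a one-row bookkeeping table migration_checkpoint(last_migrated_at):

// Hypothetical checkpoint helpers so an interrupted incremental run can resume.
// Assumes a one-row table: migration_checkpoint(last_migrated_at TIMESTAMP).
async function getCheckpoint() {
  const {rows} = await legacyDB.query(
    `SELECT last_migrated_at FROM migration_checkpoint LIMIT 1`
  );
  return rows[0] ? rows[0].last_migrated_at : new Date('2023-01-01');
}

async function setCheckpoint(timestamp) {
  await legacyDB.query(
    `UPDATE migration_checkpoint SET last_migrated_at = $1`,
    [timestamp]
  );
}

Call setCheckpoint(end) after each successful migrateByDateRange, and start the next run from getCheckpoint().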

Phase 4: Data Validation

Reconciliation Script

// reconciliation.js
class DataReconciliation {
  // The field comparisons below are specific to products; adapt per dimension.
  async reconcile(tableName, dimension) {
    console.log(`Reconciling ${tableName} → ${dimension}...`);

    const discrepancies = [];

    // Fetch all legacy records that have been migrated
    const {rows: legacyRecords} = await legacyDB.query(
      `SELECT * FROM ${tableName} WHERE core_id IS NOT NULL`
    );

    for (const legacy of legacyRecords) {
      // Fetch corresponding Core API record (axios puts the body on .data)
      const {data: core} = await coreAPI.get(
        `/api/v4/core/${dimension}/${legacy.core_id}`
      );

      // Compare fields
      if (legacy.name !== core.XPRD01) {
        discrepancies.push({
          id: legacy.id,
          field: 'name',
          legacy: legacy.name,
          core: core.XPRD01
        });
      }

      // Normalize to Number — pg returns NUMERIC columns as strings
      if (Number(legacy.price) !== Number(core.XPRD02)) {
        discrepancies.push({
          id: legacy.id,
          field: 'price',
          legacy: legacy.price,
          core: core.XPRD02
        });
      }
    }

    if (discrepancies.length > 0) {
      console.error(`Found ${discrepancies.length} discrepancies`);
      console.table(discrepancies);
    } else {
      console.log('✅ All records match!');
    }

    return discrepancies;
  }
}
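
A small driver can run reconciliation per table/dimension pair, for example after each migration phase or as a nightly job; the pairs below are illustrative:

// Example driver: reconcile each migrated table against its Core dimension.
const reconciler = new DataReconciliation();

(async () => {
  const pairs = [
    ['products', 'products'],
    ['categories', 'categories'],
    ['orders', 'orders']
  ];

  let total = 0;
  for (const [table, dimension] of pairs) {
    const discrepancies = await reconciler.reconcile(table, dimension);
    total += discrepancies.length;
  }

  process.exitCode = total > 0 ? 1 : 0; // non-zero exit fails a CI/cron job
})();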

Phase 5: Application Migration

Backend Migration

// Before: Direct database access
const getProducts = async () => {
  const result = await db.query('SELECT * FROM products WHERE active = true');
  return result.rows;
};

// After: Core API
const getProducts = async () => {
  const response = await coreAPI.get('/api/v4/core/products', {
    params: {filters: 'XPRD06:eq:Y,TREC:eq:N'}
  });
  return response.data;
};

// Before: Direct INSERT
const createProduct = async (data) => {
  const result = await db.query(
    'INSERT INTO products (name, price) VALUES ($1, $2) RETURNING *',
    [data.name, data.price]
  );
  return result.rows[0];
};

// After: Core API
const createProduct = async (data) => {
  const response = await coreAPI.post('/api/v4/core/products', {
    XPRD01: data.name,
    XPRD02: data.price
  });
  return response.data;
};
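
Rather than switching every endpoint at once, a common pattern is to keep both implementations behind one interface and flip them with a feature flag, so individual endpoints can be cut over (and rolled back) independently. A sketch, assuming an environment-variable flag:

// Hypothetical strangler-style toggle: each read goes through either backend
// depending on a flag, so endpoints can be migrated one at a time.
const USE_CORE_API = process.env.USE_CORE_API === 'true';

const productRepository = {
  async getProducts() {
    if (USE_CORE_API) {
      const response = await coreAPI.get('/api/v4/core/products', {
        params: {filters: 'XPRD06:eq:Y,TREC:eq:N'}
      });
      return response.data;
    }
    const result = await db.query('SELECT * FROM products WHERE active = true');
    return result.rows;
  }
};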

Frontend Migration

// Before: Direct API calls
const fetchProducts = async () => {
  const response = await fetch('/api/products');
  return response.json();
};

// After: Core API via backend
const fetchProducts = async () => {
  const response = await fetch('/api/v4/core/products');
  return response.json();
};
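
Serving the Core API "via backend" implies the backend attaches credentials and context headers so the frontend never holds a token. A minimal proxy sketch using http-proxy-middleware (an assumption; any reverse proxy works):

// Hypothetical backend proxy: forwards /api/v4/core/* to the Core API with
// auth and context headers attached, keeping the token server-side.
const {createProxyMiddleware} = require('http-proxy-middleware');

app.use('/api/v4/core', createProxyMiddleware({
  target: process.env.CORE_API_URL,
  changeOrigin: true,
  headers: {
    Authorization: `Bearer ${process.env.CORE_API_TOKEN}`,
    'X-Source': process.env.SOURCE,
    'X-Peso': process.env.PESO
  }
}));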

Phase 6: Cutover & Rollback

Blue-Green Deployment

// deployment-strategy.js
// deployGreen, runSmokeTests, switchTraffic, and sleep are infrastructure-
// specific helpers assumed to be provided by your deployment tooling.
class BlueGreenDeployment {
  async cutover() {
    // 1. Deploy new version (green) alongside old (blue)
    console.log('Deploying green environment...');
    await deployGreen();

    // 2. Run smoke tests on green
    console.log('Running smoke tests...');
    const testsPass = await runSmokeTests('green');

    if (!testsPass) {
      console.error('Smoke tests failed, staying on blue');
      return false;
    }

    // 3. Switch traffic gradually (canary)
    console.log('Switching 10% traffic to green...');
    await switchTraffic({green: 10, blue: 90});
    await sleep(300000); // Monitor for 5 minutes

    console.log('Switching 50% traffic to green...');
    await switchTraffic({green: 50, blue: 50});
    await sleep(300000);

    console.log('Switching 100% traffic to green...');
    await switchTraffic({green: 100, blue: 0});

    // 4. Keep blue running for rollback
    console.log('Green is now active, blue on standby');

    return true;
  }

  async rollback() {
    console.log('Rolling back to blue...');
    await switchTraffic({green: 0, blue: 100});
    console.log('Rollback complete');
  }
}
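
switchTraffic is deliberately left abstract, since it depends entirely on your load balancer. As one illustration, with an LB that exposes an admin API for upstream weights (endpoint and payload are hypothetical):

// Hypothetical switchTraffic: sets blue/green weights through a load balancer
// admin API. Endpoint, payload shape, and auth depend entirely on your LB.
const axios = require('axios');

async function switchTraffic({green, blue}) {
  await axios.put(`${process.env.LB_ADMIN_URL}/upstreams/app/weights`, {
    green, // percentage of traffic to the new environment
    blue   // percentage remaining on the old environment
  });
}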

Monitoring During Cutover

// monitor-cutover.js
// getErrorRate, getP95ResponseTime, and checkDataConsistency are assumed to
// query your metrics backend; sleep is the promise-based delay used earlier.
class CutoverMonitor {
  async monitor(durationMinutes = 60) {
    const alerts = [];
    const startTime = Date.now();

    while (Date.now() - startTime < durationMinutes * 60000) {
      // Check error rate (percentage of failed requests)
      const errorRate = await this.getErrorRate();
      if (errorRate > 5) {
        alerts.push({
          time: new Date(),
          type: 'error_rate',
          value: errorRate,
          threshold: 5
        });
      }

      // Check response time (milliseconds)
      const p95ResponseTime = await this.getP95ResponseTime();
      if (p95ResponseTime > 500) {
        alerts.push({
          time: new Date(),
          type: 'response_time',
          value: p95ResponseTime,
          threshold: 500
        });
      }

      // Check data consistency
      const inconsistencies = await this.checkDataConsistency();
      if (inconsistencies > 0) {
        alerts.push({
          time: new Date(),
          type: 'data_inconsistency',
          count: inconsistencies
        });
      }

      await sleep(60000); // Check every minute
    }

    if (alerts.length > 0) {
      console.error('⚠️ Alerts during cutover:');
      console.table(alerts);
      return false;
    }

    console.log('✅ Cutover successful, no alerts');
    return true;
  }
}
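
Tying the two together, a cutover runner can switch traffic, watch the monitor, and roll back automatically if alerts fire:

// Example runner: cut over, monitor for an hour, roll back on alerts.
const deployment = new BlueGreenDeployment();
const monitor = new CutoverMonitor();

(async () => {
  const cutoverOk = await deployment.cutover();
  if (!cutoverOk) return; // smoke tests failed; traffic is still on blue

  const healthy = await monitor.monitor(60);
  if (!healthy) {
    await deployment.rollback();
  }
})();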

Migration Checklist

Pre-Migration

  • Legacy system inventory complete
  • Field mapping documented
  • Migration plan approved
  • Core API access configured
  • Metadata verified
  • Migration tools tested
  • Rollback plan documented

During Migration

  • Data migrated successfully
  • Validation passed
  • Application updated
  • Tests passing
  • Performance acceptable
  • Monitoring in place

Post-Migration

  • All traffic on Core API
  • Legacy system decommissioned (or on standby)
  • Documentation updated
  • Team trained
  • Support processes updated
  • Lessons learned documented

Summary

Key Success Factors:

  1. ✅ Thorough planning and assessment
  2. ✅ Incremental migration (not big bang)
  3. ✅ Dual-write for active data
  4. ✅ Comprehensive validation
  5. ✅ Gradual cutover with monitoring
  6. ✅ Keep rollback option available

Common Pitfalls:

  • ❌ Insufficient testing
  • ❌ No rollback plan
  • ❌ Poor data validation
  • ❌ Big bang cutover
  • ❌ Inadequate monitoring