At Big D Companies, I faced a challenge familiar to many developers: a legacy PHP-based ERP system managing SCADA automation that desperately needed modernization. The system worked, but maintaining it was becoming increasingly difficult, and adding new features felt like archaeology.
The Legacy Situation
The existing system was a classic PHP monolith:
- Tight coupling between presentation and business logic
- Limited test coverage
- Performance issues from inefficient database queries
- Hard to onboard new developers
- MySQL database that needed to be preserved
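To make the query problem concrete: much of the slowness came from the classic N+1 pattern, where the legacy code fetched a list and then issued one more query per row inside a loop. A minimal sketch (with a stubbed query counter standing in for the real database; the function names are illustrative, not from the legacy codebase) shows why a single JOIN is so much cheaper:

```javascript
// Stand-in for a database connection that just counts round trips.
function makeDb() {
  let queryCount = 0;
  return {
    query(_sql, _params) { queryCount += 1; return []; },
    get count() { return queryCount; },
  };
}

// N+1 pattern: one query for the device list, then one per device
// to look up its site — 1 + N round trips.
function fetchDevicesNPlusOne(db, deviceIds) {
  db.query('SELECT * FROM devices');
  for (const id of deviceIds) {
    db.query('SELECT name FROM sites WHERE id = ?', [id]);
  }
  return db.count;
}

// Batched version: a single JOIN replaces all the per-device lookups.
function fetchDevicesJoined(db) {
  db.query(
    'SELECT d.*, s.name FROM devices d LEFT JOIN sites s ON d.site_id = s.id'
  );
  return db.count;
}
```

With 50 devices, the loop issues 51 queries where the JOIN issues one; the REST API below bakes the JOIN in from the start.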
The Migration Strategy
We couldn't do a "big bang" rewrite - the system was too critical. Instead, we took a gradual approach that allowed us to deliver value incrementally while maintaining business continuity.
Step 1: API Layer First
The first step was extracting the data layer behind a REST API. This allowed us to start building the new frontend while the old system continued running:
// api/routes/devices.js
const express = require('express');
const router = express.Router();
const mysql = require('mysql2/promise');

const pool = mysql.createPool({
  host: process.env.DB_HOST,
  user: process.env.DB_USER,
  password: process.env.DB_PASSWORD,
  database: process.env.DB_NAME,
  waitForConnections: true,
  connectionLimit: 10,
});

router.get('/devices', async (req, res) => {
  try {
    const [rows] = await pool.query(`
      SELECT
        d.id,
        d.name,
        d.type,
        d.status,
        d.last_reading,
        s.name AS site_name
      FROM devices d
      LEFT JOIN sites s ON d.site_id = s.id
      WHERE d.active = 1
      ORDER BY d.name
    `);
    res.json(rows);
  } catch (error) {
    console.error('Database error:', error);
    res.status(500).json({ error: 'Internal server error' });
  }
});

router.get('/devices/:id/readings', async (req, res) => {
  const { id } = req.params;
  const { startDate, endDate } = req.query;

  // Reject incomplete date ranges up front instead of passing
  // undefined values into the BETWEEN clause
  if (!startDate || !endDate) {
    return res.status(400).json({ error: 'startDate and endDate are required' });
  }

  try {
    const [rows] = await pool.query(`
      SELECT
        timestamp,
        temperature,
        pressure,
        flow_rate
      FROM device_readings
      WHERE device_id = ?
        AND timestamp BETWEEN ? AND ?
      ORDER BY timestamp DESC
    `, [id, startDate, endDate]);
    res.json(rows);
  } catch (error) {
    console.error('Database error:', error);
    res.status(500).json({ error: 'Internal server error' });
  }
});

module.exports = router;
Step 2: Build the React Frontend
With the API in place, we built a modern React/Next.js frontend:
// components/DeviceMonitor.tsx
import { useState, useEffect } from 'react';
import { Card, CardHeader, CardContent } from '@/components/ui/card';
import { Badge } from '@/components/ui/badge';

interface Device {
  id: number;
  name: string;
  type: string;
  status: 'online' | 'offline' | 'warning';
  last_reading: string;
  site_name: string;
}

export default function DeviceMonitor() {
  const [devices, setDevices] = useState<Device[]>([]);
  const [loading, setLoading] = useState(true);

  useEffect(() => {
    async function fetchDevices() {
      try {
        const response = await fetch('/api/devices');
        const data = await response.json();
        setDevices(data);
      } catch (error) {
        console.error('Failed to fetch devices:', error);
      } finally {
        setLoading(false);
      }
    }

    fetchDevices();

    // Poll for updates every 30 seconds
    const interval = setInterval(fetchDevices, 30000);
    return () => clearInterval(interval);
  }, []);

  if (loading) {
    return <div>Loading devices...</div>;
  }

  return (
    <div className="grid grid-cols-1 md:grid-cols-2 lg:grid-cols-3 gap-4">
      {devices.map((device) => (
        <Card key={device.id}>
          <CardHeader>
            <div className="flex justify-between items-center">
              <h3 className="font-semibold">{device.name}</h3>
              <Badge
                variant={
                  device.status === 'online' ? 'success' :
                  device.status === 'warning' ? 'warning' : 'destructive'
                }
              >
                {device.status}
              </Badge>
            </div>
          </CardHeader>
          <CardContent>
            <p className="text-sm text-gray-600">{device.site_name}</p>
            <p className="text-sm">Type: {device.type}</p>
            <p className="text-sm">
              Last Reading: {new Date(device.last_reading).toLocaleString()}
            </p>
          </CardContent>
        </Card>
      ))}
    </div>
  );
}
Step 3: Progressive Enhancement
We didn't migrate everything at once. Instead, we used a routing strategy that gradually replaced PHP pages:
// server.js
const express = require('express');
const { createProxyMiddleware } = require('http-proxy-middleware');
const next = require('next');

const dev = process.env.NODE_ENV !== 'production';
const app = next({ dev });
const handle = app.getRequestHandler();

const PORT = process.env.PORT || 3000;
const PHP_SERVER = 'http://localhost:8080';

app.prepare().then(() => {
  const server = express();

  // New React routes
  server.use('/dashboard', (req, res) => handle(req, res));
  server.use('/devices', (req, res) => handle(req, res));
  server.use('/api', (req, res) => handle(req, res));

  // Proxy everything else to PHP (legacy routes)
  server.use('*', createProxyMiddleware({
    target: PHP_SERVER,
    changeOrigin: true,
  }));

  server.listen(PORT, () => {
    console.log(`> Ready on http://localhost:${PORT}`);
  });
});
Testing Strategy
Testing legacy systems is challenging. We used Jest for new code and maintained the existing test suite:
// __tests__/DeviceMonitor.test.tsx
import { render, screen, waitFor } from '@testing-library/react';
import DeviceMonitor from '@/components/DeviceMonitor';

// Mock fetch
global.fetch = jest.fn();

describe('DeviceMonitor', () => {
  beforeEach(() => {
    (fetch as jest.Mock).mockClear();
  });

  it('displays devices after loading', async () => {
    (fetch as jest.Mock).mockResolvedValueOnce({
      json: async () => ([
        {
          id: 1,
          name: 'Pump 1',
          type: 'Centrifugal',
          status: 'online',
          last_reading: '2024-04-28T10:00:00Z',
          site_name: 'Site A'
        }
      ])
    });

    render(<DeviceMonitor />);

    await waitFor(() => {
      expect(screen.getByText('Pump 1')).toBeInTheDocument();
    });
    expect(screen.getByText('online')).toBeInTheDocument();
  });
});
Performance Improvements
The migration yielded significant performance improvements:
- Database Query Optimization: Indexed queries reduced load times by 70%
- Client-Side Rendering: replacing full PHP page reloads with targeted React re-renders made the UI far more responsive
- Code Splitting: Next.js automatically split code, reducing initial load
- Caching: Implemented Redis caching for frequently accessed data
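The caching bullet deserves a sketch. The exact wiring in our codebase isn't shown here, but a read-through cache in front of the device queries looks roughly like this — `cachedQuery` accepts any client exposing `get`/`set` (a thin wrapper over a Redis library in production, a Map-backed stub in tests); all names are illustrative:

```javascript
// Read-through cache: return the cached value if present, otherwise run
// the loader, store its JSON-serialized result with a TTL, and return it.
async function cachedQuery(client, key, ttlSeconds, loader) {
  const hit = await client.get(key);
  if (hit !== null && hit !== undefined) {
    return JSON.parse(hit);
  }
  const fresh = await loader();
  await client.set(key, JSON.stringify(fresh), ttlSeconds);
  return fresh;
}

// Map-backed stub with the same get/set shape, handy in unit tests
// (TTL expiry omitted for brevity).
function makeMemoryClient() {
  const store = new Map();
  return {
    async get(key) { return store.has(key) ? store.get(key) : null; },
    async set(key, value, _ttlSeconds) { store.set(key, value); },
  };
}
```

In an Express route this wraps the database call, e.g. `cachedQuery(client, 'devices:active', 30, () => runDevicesQuery())` — data that is up to 30 seconds stale is acceptable for a dashboard that already polls on that cadence.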
CI/CD Pipeline
We used CircleCI for automated testing and deployment:
version: 2.1
jobs:
  test:
    docker:
      - image: circleci/node:16
      - image: circleci/mysql:8.0
        environment:
          MYSQL_ROOT_PASSWORD: testpass
          MYSQL_DATABASE: test_db
    steps:
      - checkout
      - restore_cache:
          keys:
            - v1-dependencies-{{ checksum "package.json" }}
      - run: npm install
      - run: npm test
      - save_cache:
          paths:
            - node_modules
          key: v1-dependencies-{{ checksum "package.json" }}
  deploy:
    docker:
      - image: circleci/node:16
    steps:
      - checkout
      - run: npm install
      - run: npm run build
      - run: npm run deploy
workflows:
  version: 2
  test-and-deploy:
    jobs:
      - test
      - deploy:
          requires:
            - test
          filters:
            branches:
              only: main
Lessons Learned
- Incremental is better than all-at-once: Gradual migration reduced risk
- API-first enables flexibility: The API layer gave us freedom to iterate on the frontend
- Keep the database: We preserved MySQL, which simplified migration significantly
- Maintain the old system: We continued fixing critical bugs in PHP during migration
- Performance matters: Users loved the speed improvements
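One practical footnote on the API-first lesson: `fetch` only rejects on network failure, so a component that calls `response.json()` directly will silently accept a 500 from the API layer. A small wrapper (a sketch of the idea, not the exact code we shipped; `fetchImpl` is injectable purely so it can be tested without a server) centralizes that check for every new page:

```javascript
// Turn non-2xx HTTP responses into thrown errors so callers can handle
// API failures and network failures in a single catch block.
async function fetchJson(url, options = {}, fetchImpl = fetch) {
  const response = await fetchImpl(url, options);
  if (!response.ok) {
    throw new Error(`Request to ${url} failed with status ${response.status}`);
  }
  return response.json();
}
```

A component's data loader then becomes `const data = await fetchJson('/api/devices')`, and its existing catch block covers HTTP errors as well as dropped connections.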
Conclusion
Modernizing legacy systems is challenging but incredibly rewarding. The key is finding a migration path that delivers value incrementally while maintaining business continuity.
The PHP system at Big D Companies now has a modern React frontend, improved performance, and a maintainable codebase - all while keeping the business running smoothly throughout the transition.

Jason Cochran
Software Engineer | Cloud Consultant | Founder at Strataga
27 years of experience building enterprise software for oil & gas operators and startups. Specializing in SCADA systems, field data solutions, and AI-powered rapid development. Based in Midland, TX serving the Permian Basin.