Data Migration Projects with Laravel

Strategies, Pitfalls, and Two Smart Concepts

Now on video: Phillip Kalusek’s talk at the Laravel Meetup

Data migrations are often a tricky subject – and not because of schema changes, but because of moving customer data that’s sometimes decades old and comes from a mix of legacy systems. At the Laravel Meetup, PHP developer Phillip shared his experience from numerous projects – and demonstrated how to handle such migrations in a technically sound and sustainable way using Laravel.

You’ll find the full recording below – here’s a quick overview of what to expect:

What’s it all about?

First, Phillip clears up a common misconception:

This isn’t about schema migrations, which are used to change the structure of a database. It’s about migrating customer data from various old systems into a new Laravel application – including a new data structure, new requirements, and all the baggage that comes with legacy data. And surprisingly, there’s very little practical guidance out there on how to do this well.


And this isn’t a niche issue: Studies show that over 80% of all data migration projects run into serious trouble (source: LumenData) – often resulting in significant budget overruns and missed deadlines.

The five most common issues that cause migration projects to fail:

1. Different business logic between the old and new systems

2. Inconsistent data sources (API, CSV, dumps, direct DB access)

3. Data inconsistencies that only surface late in the process

4. Old data vs. new requirements – often a poor fit

5. Not enough budget or time allocated for the migration itself

Tips & Strategies from Real-World Projects

Phillip shares concrete, reusable tips for implementing data migrations with Laravel:

SQL first: Direct access to structured data means better control

Chunking over offset: For cleaner, more performant imports (see the sketch after this list)

Lazy import for large data sets: Conserves memory by streaming one row at a time

Choose your error strategy deliberately: “Revert on fail,” “Skip on fail,” or “Rollback on error”

Action pattern for transformations: Reusable in jobs, commands, or the UI
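
To make a few of these tips concrete, here is a minimal sketch (not code from the talk) of a chunked, skip-on-fail import driven by a reusable action. The LegacyCustomer and Customer models, the table layout, and the ImportCustomerAction class are illustrative assumptions, and both classes are shown in one snippet only for brevity:

```php
<?php

namespace App\Console\Commands;

use App\Models\LegacyCustomer; // hypothetical model, e.g. with $connection = 'legacy'
use Illuminate\Console\Command;
use Illuminate\Support\Facades\Log;

// Hypothetical action: the transformation lives in one reusable class
// that can be invoked from jobs, commands, or controllers alike.
class ImportCustomerAction
{
    public function handle(LegacyCustomer $legacy): void
    {
        \App\Models\Customer::updateOrCreate(
            ['legacy_id' => $legacy->id],
            ['name' => trim($legacy->name), 'email' => strtolower($legacy->email)]
        );
    }
}

class ImportLegacyCustomers extends Command
{
    protected $signature = 'import:customers';

    protected $description = 'Chunked import of legacy customers, skipping rows that fail';

    public function handle(ImportCustomerAction $action): int
    {
        // chunkById keeps queries fast on large tables: instead of an
        // ever-growing OFFSET, each chunk filters on "id > last seen id".
        LegacyCustomer::query()->chunkById(500, function ($customers) use ($action) {
            foreach ($customers as $customer) {
                try {
                    $action->handle($customer);
                } catch (\Throwable $e) {
                    // "Skip on fail": log the row and keep the import running
                    Log::warning('Skipped legacy customer', [
                        'id'    => $customer->id,
                        'error' => $e->getMessage(),
                    ]);
                }
            }
        });

        return self::SUCCESS;
    }
}
```

For the lazy-import tip, `LegacyCustomer::query()->lazyById(500)` would stream the same rows as a LazyCollection with a similarly flat memory profile.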

And then it goes one level deeper:

In addition to these general best practices, Phillip introduces two well-thought-out concepts that have proven especially effective in real-world scenarios – and that you can easily adapt to your own upcoming projects:

Concept 1: The Normalization Manager

A pattern for structured data transformation, inspired by Aaron Francis

The core idea: a manager orchestrates various normalizers, each responsible for cleaning up and standardizing individual fields (e.g., name, date, status).
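
A minimal sketch of what such a manager could look like; this illustrates the pattern rather than the talk's actual code, and the Normalizer interface and the field-level classes are assumptions:

```php
<?php

namespace App\Normalization;

use Carbon\Carbon;

interface Normalizer
{
    public function normalize(mixed $value): mixed;
}

// Each normalizer does exactly one thing, so it is easy to test and reuse.
class NameNormalizer implements Normalizer
{
    public function normalize(mixed $value): mixed
    {
        return mb_convert_case(trim((string) $value), MB_CASE_TITLE, 'UTF-8');
    }
}

class DateNormalizer implements Normalizer
{
    public function normalize(mixed $value): mixed
    {
        // Legacy systems tend to mix date formats; standardize on ISO 8601
        return Carbon::parse((string) $value)->toDateString();
    }
}

// The manager orchestrates the field-level normalizers.
class NormalizationManager
{
    /** @param array<string, Normalizer> $normalizers keyed by field name */
    public function __construct(private array $normalizers) {}

    /** @param array<string, mixed> $row raw legacy record */
    public function normalize(array $row): array
    {
        foreach ($this->normalizers as $field => $normalizer) {
            if (array_key_exists($field, $row)) {
                $row[$field] = $normalizer->normalize($row[$field]);
            }
        }

        return $row;
    }
}
```

A call site then reads like `(new NormalizationManager(['name' => new NameNormalizer(), 'birthdate' => new DateNormalizer()]))->normalize($legacyRow)`, and supporting a new field means adding one small, individually testable class.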

Advantages:

Scalable and versionable transformation

Reusable across other projects

Fine-grained solution: Each individual component serves one clearly defined purpose.

Concept 2: The Intermediate Staging Tool

When legacy data is only accessible via an API, Phillip recommends using an intermediate system – for example, with Laravel Zero.

Instead of importing the data directly into the application, it’s first written to a SQLite staging database – unprocessed, but complete (see the sketch after this list). This approach offers major advantages:

Repeatable migrations, even without API access

Improved analysis, querying, and debugging

Version control and easy team sharing

Can be combined with other import sources (e.g., CSV)
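
Here is a rough sketch of the staging step as a Laravel Zero command, assuming the database and HTTP client components are installed; the API endpoint, its pagination shape, the `staging` connection, and the `staging_customers` table are all illustrative assumptions:

```php
<?php

namespace App\Commands;

use Illuminate\Support\Facades\DB;
use Illuminate\Support\Facades\Http;
use LaravelZero\Framework\Commands\Command;

class StageCustomers extends Command
{
    protected $signature = 'stage:customers';

    protected $description = 'Pull legacy customers from the API into the SQLite staging DB';

    public function handle(): int
    {
        $page = 1;

        do {
            // Hypothetical paginated legacy API
            $response = Http::get('https://legacy.example.com/api/customers', [
                'page' => $page,
            ])->throw()->json();

            foreach ($response['data'] as $customer) {
                // Store the raw payload unprocessed but complete, keyed by its
                // legacy ID, so the run is idempotent and repeatable offline.
                DB::connection('staging')->table('staging_customers')->updateOrInsert(
                    ['external_id' => $customer['id']],
                    ['payload' => json_encode($customer)]
                );
            }

            $page++;
        } while (! empty($response['next_page_url']));

        return self::SUCCESS;
    }
}
```

A later import step then reads from the staging connection instead of the API, which is exactly what makes the migration repeatable, debuggable, and easy to share with the team.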

And what does the future hold?

The talk ends with an exciting outlook: How could AI-powered tools such as MCP (Model Context Protocol) servers help in the future – by analyzing table structures, identifying inconsistencies, or even suggesting migration scripts?

Watch it now:

Here’s the full recording of Phillip’s talk:

[Video: Data Migration Projects with Laravel – Phillip Kalusek at the Laravel Meetup]

Planning your own migration project and looking for expert support?

Get in touch – we’re happy to help.

Contact