In the context of a Microsoft Business Central deployment, data migration is the crucial step that allows an already active company to quickly regain control of its financial cycle (supplier payments, customer reminders, revenue tracking, etc.).
In most organizations moving towards a SaaS (Software as a Service) solution, this migration is primarily carried out by importing Excel files containing lists of customers, suppliers, items, invoices, and other essential records.
While importing these files is technically simple, the real difficulty lies in the ability to follow the migration process in a reliable, transparent, and controlled manner. Insufficient monitoring can lead to duplicates, accounting inconsistencies, or service interruptions that directly affect cash flow and customer relations.
In this article, I share my own methodology for managing data migration on Business Central. It revolves around a succession of clearly defined steps, from preparing the files to verification after migration, including the use of monitoring tools, dashboards, and regular checkpoints.
The objective is to provide project teams with an operational framework that minimizes risks, guarantees the quality of imported data, and ensures a smooth transition to the new ERP system.
Follow me through the different phases of this approach, and you will discover how I organize and control each step of the import process to secure the outcome of what is truly a project within the project.
The Requirements
Data migration monitoring relies on several essential requirements:
- Defining a data migration process with the stakeholders, specifying who holds each responsibility and which stages must be validated.
- Defining and maintaining the expected structure for each file (mandatory columns, date formats, separators) to ensure compliance before any import attempt.
- Clearly identifying each data type and its import priority.
- Recording the version of each file provided by the client to be able to trace changes.
- Maintaining an exhaustive log of files received and processing applied (date, operator, result, potential errors).
- Communicating errors to the client as soon as they are detected.
- Implementing management indicators, such as the total number of files transmitted, the number of files successfully processed, and the number of files that generated errors.
These elements, gathered in a monitoring dashboard, allow for total traceability and quick intervention in case of anomaly.
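To make these indicators tangible, here is a minimal sketch in Python, assuming each processed file is summarized by a simple status string; the status values and indicator names are illustrative and not part of Business Central.

```python
from collections import Counter


def migration_indicators(statuses: list[str]) -> dict[str, int]:
    """Summarize the monitoring indicators from per-file statuses."""
    counts = Counter(statuses)
    return {
        "files_transmitted": len(statuses),
        "files_imported": counts["imported"],
        "files_in_error": counts["error"],
    }


# Example:
# migration_indicators(["imported", "error", "imported"])
# -> {"files_transmitted": 3, "files_imported": 2, "files_in_error": 1}
```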
The Process
The migration process begins when the client transmits the data sets in the form of Excel files, each conforming to the predefined format (columns, field types, and separators). The solutions integrator, which I will call the partner, retrieves these files and organizes them in a structured folder hierarchy according to the stages ("Reception", "Tests", "Import", "Validation", and "Archiving").
During the testing phase, the files are verified: schema, consistency, and constraint checks are executed, and any anomalies are recorded in an error report that the partner immediately transmits to the client.
The client then makes the necessary corrections and sends back the revised versions.
As soon as no problem is detected, the files move to the import stage, during which they are loaded into Business Central.
The data is then integrated into the system, with the master and dynamic tables being populated according to the established priority order.
Each file transfer operation (reception, test, import, validation) is recorded in the file monitoring log, thus ensuring a complete trace of the migration process, from the first reception to the final consolidation.
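As an illustration of this staged organization, the sketch below moves a file from one stage folder to the next and appends the operation to the monitoring log. The root folder, log file name, and column layout are assumptions; any shared drive or document library could serve the same purpose.

```python
import csv
from datetime import datetime
from pathlib import Path

STAGES = ["Reception", "Tests", "Import", "Validation", "Archiving"]
ROOT = Path("migration")           # assumed root folder on the partner side
LOG = ROOT / "file_log.csv"        # assumed name of the monitoring log


def promote(file_name: str, current_stage: str, operator: str, result: str) -> None:
    """Move a file to the next stage folder and record the operation in the log."""
    # "Archiving" is terminal; promoting from it would raise an IndexError in this sketch.
    next_stage = STAGES[STAGES.index(current_stage) + 1]
    source = ROOT / current_stage / file_name
    target = ROOT / next_stage / file_name
    target.parent.mkdir(parents=True, exist_ok=True)
    source.rename(target)
    with LOG.open("a", newline="") as log:
        csv.writer(log).writerow(
            [datetime.now().isoformat(timespec="seconds"),
             file_name, current_stage, next_stage, operator, result]
        )
```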
The Files
The Excel files used for migration must absolutely comply with a precise template: each sheet must contain a data table starting on the third line.
The first line must indicate the name of the configuration pack as well as the corresponding table number in Business Central.
The second line provides details for each column: a clear label; an indication of "mandatory", "optional", or "to be left blank"; and the expected data type (numeric, date, text, etc.).
The third line constitutes the header of the table itself, exactly aligned with the previously described columns.
Some columns are mandatory, others are optional, and some should never be filled in. It is crucial to scrupulously respect the indicated data type: a numeric field must contain only numbers, a date must follow the expected format, etc.
Furthermore, some columns require a reference value that must be drawn from another template file (e.g., a supplier code or a unit of measure identifier).
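The structural check described above can be sketched as follows with openpyxl; the set of required headers, the column names in the usage example, and the way the pack name is read from the first row are assumptions, and real templates may span several sheets.

```python
from openpyxl import load_workbook


def check_template(path: str, required_headers: set[str]) -> list[str]:
    """Return the structural anomalies found in the first sheet of a template file."""
    sheet = load_workbook(path, read_only=True).active
    rows = list(sheet.iter_rows(min_row=1, max_row=3, values_only=True))
    problems: list[str] = []
    if len(rows) < 3:
        return ["Rows 1-3: pack name, column details, or table header missing"]
    if not rows[0] or not rows[0][0]:
        problems.append("Row 1: configuration pack name and table number are missing")
    headers = {str(cell).strip() for cell in rows[2] if cell is not None}
    missing = required_headers - headers
    if missing:
        problems.append(f"Row 3: mandatory columns not found: {sorted(missing)}")
    return problems


# Example (illustrative column names):
# check_template("customers.xlsx", {"No.", "Name", "Customer Posting Group"})
```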

The partner must keep every version of the templates provided to the client and commits to adding new fields only at the end of the table, without ever changing the existing order of the columns.
As for the client, they commit to filling in the data table while respecting these rules, without altering the structure of the sheet. This discipline ensures that the XML mapping defined in the file corresponds exactly to the requirements of Business Central, thus facilitating error-free import.
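This "append only" rule can be expressed as a simple comparison, assuming the header rows of the two template versions have already been extracted (for instance with the check sketched above).

```python
def column_order_preserved(previous_headers: list[str], new_headers: list[str]) -> bool:
    """A new template version may only append columns, never reorder or remove them."""
    return new_headers[: len(previous_headers)] == previous_headers


# Example:
# column_order_preserved(["No.", "Name"], ["No.", "Name", "Phone No."])  -> True
# column_order_preserved(["No.", "Name"], ["Name", "No."])               -> False
```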
The Data Families
The data to be migrated falls into two main families.
Static Data
- First, master data groups the central repositories for each module: customers, suppliers, and items.
- Next, subsidiary data, such as customer addresses or supplier bank details (RIB), complements the main records.
- Secondary data, such as payment methods, units of measure, or product categories, enriches the functional context without constituting the main entity.
- Setup data, which modifies the behavior of certain flows.
- Finally, posting groups, which associate the accounting setup with master data. For example, the customer posting group determines the accounts receivable account to be used.
Dynamic Data
Dynamic data is what constitutes the accounting and operational history: opening balances, open entries, stock movements, etc. Unlike master data, this information evolves continuously and reflects the company's real activity. In this family, we distinguish two types of data.
The first type must be posted through an accounting journal and will therefore not be imported directly into the accounting entry tables:
- Opening balances
- Open customer / vendor entries
- Fixed assets history
- Inventory / stock
The second type takes the form of documents and represents the beginning of a flow that generally leads to transactional entries. These documents are structured as a header and lines:
- Open sales / purchase orders
- Contracts
Correctly taking these levels (master, subsidiary/secondary, and dynamic) into account is essential to ensure a complete and reliable data migration in Business Central.
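To make the distinction between the two dynamic-data shapes concrete, here is a purely illustrative Python model: journal-type data posted through an accounting journal, and document-type data made of a header and lines. The field names are assumptions, not Business Central tables.

```python
from dataclasses import dataclass
from datetime import date


@dataclass
class JournalLine:
    """Journal-type dynamic data (opening balances, open entries, ...)."""
    posting_date: date
    account_no: str
    amount: float


@dataclass
class DocumentLine:
    """One line of a document-type record."""
    item_no: str
    quantity: float
    unit_price: float


@dataclass
class DocumentHeader:
    """Document-type dynamic data (open orders, contracts, ...)."""
    document_no: str
    customer_no: str
    lines: list[DocumentLine]
```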
The Order of Priority
The import sequence must respect a strict hierarchy to avoid dependency conflicts:
- We always start with secondary data (units of measure, payment methods, classifications, etc.), because it is required by the subsequent settings and master records.
- Next comes the import of posting groups and general settings (chart of accounts, cost centers, VAT rules), which define the transaction processing framework.
- Once the functional foundation is in place, master data (customers, suppliers, items) is loaded, followed by its subsidiary data (addresses, bank details).
- Finally, dynamic data (opening balances, open entries, stock movements) is imported last, as it relies on all the repositories already in place.
In Business Central, a dedicated configuration worksheet records these priorities and serves as a monitoring dashboard. Even if it is not mandatory, its use ensures that each step is executed in the correct order and minimizes the risk of inconsistency during migration.
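The same hierarchy can be encoded as a small lookup used to verify that a family is only imported once everything it depends on has been consolidated; the family names and numbering below simply mirror the order described above and are not taken from Business Central.

```python
IMPORT_PRIORITY = {
    "secondary": 1,       # units of measure, payment methods, classifications
    "posting_groups": 2,  # posting groups and general settings
    "master": 3,          # customers, suppliers, items
    "subsidiary": 4,      # addresses, bank details
    "dynamic": 5,         # opening balances, open entries, stock movements
}


def ready_to_import(family: str, completed: set[str]) -> bool:
    """A family may only be loaded once every lower-numbered family is done."""
    level = IMPORT_PRIORITY[family]
    return all(name in completed for name, rank in IMPORT_PRIORITY.items() if rank < level)


# Example: ready_to_import("master", {"secondary"}) -> False (posting groups missing)
```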
Data Processing
The partner opens each received file, immediately logs its version in the import tracking register, then verifies the compliance of the structure (number and name of columns, date formats, separators) as well as the XML mapping that links each column of the table to the target fields of Business Central.
Once the mapping is validated, the file is loaded into Business Central's configuration package module, and the system then signals any validation errors (duplicates, key constraints, out-of-range values). If no anomaly is detected, the data is consolidated into the system according to the defined priority. In case of errors, the partner extracts the detailed list, records it in the error log, and sends it back to the client for correction.
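Here is a hedged sketch of that error round trip: errors reported during the load are appended to a shared error log that is then sent back to the client. The log columns and the error dictionary keys are assumptions; Business Central's configuration packages present errors in their own way.

```python
import csv
from datetime import datetime


def write_error_report(file_name: str, errors: list[dict], path: str = "error_log.csv") -> None:
    """Append the detailed error list for one file to the shared error log."""
    with open(path, "a", newline="", encoding="utf-8") as handle:
        writer = csv.writer(handle)
        for error in errors:
            writer.writerow([
                datetime.now().isoformat(timespec="seconds"),
                file_name,
                error.get("row"),
                error.get("column"),
                error.get("message"),
            ])
```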
File Tracking
The file tracking register is the document on which the partner keeps track of each file processed.
The goal is to keep a history of what has been done. This table therefore records the following information about each file (a minimal representation is sketched after the list):
- The file name
- The date of provision by the client
- The date of processing by the partner
- The person who processed the file
- The environment in which it was imported
- The status of the control phase
- The status of the data consolidation phase
- The number of records in the file
- The number of errors and their nature
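A minimal representation of one row of this register could look like the following; the attribute names and types are assumptions chosen to mirror the columns listed above.

```python
from dataclasses import dataclass
from datetime import date


@dataclass
class TrackedFile:
    file_name: str
    provided_on: date          # date of provision by the client
    processed_on: date         # date of processing by the partner
    processed_by: str          # person who processed the file
    environment: str           # e.g. sandbox or production
    control_status: str        # status of the control phase
    consolidation_status: str  # status of the data consolidation phase
    record_count: int          # number of records in the file
    error_count: int           # number of errors
    error_nature: str          # nature of the errors
```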
The Constraints
Importing the same type of data multiple times can quickly break the established priority chain for migration. Indeed, each category (secondary data, settings, master, subsidiary, dynamic) depends on the previous ones: references and integrity constraints are correctly resolved only if the records are loaded in the planned order. When a second batch of the same type arrives after higher levels have already been imported, it can create duplicates, overwrite values already consolidated, or introduce orphaned references that no longer find their correspondence in previously populated tables. This desynchronization leads to validation errors (duplicate keys, foreign key constraint violations) and requires restarting previous imports to restore consistency.
Thus, strict adherence to the unique import sequence, or even the consolidation of all files of the same type before moving to the next level, is essential to prevent the repetition of the same data type from compromising the overall system integrity.
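A simple pre-import check along these lines helps catch orphaned references before they turn into import errors; the row structure and key names below are purely illustrative.

```python
def orphan_references(rows: list[dict], key: str, known_codes: set[str]) -> list[dict]:
    """Return the rows whose reference is not yet present in the target repository."""
    return [row for row in rows if row.get(key) not in known_codes]


# Example: bank details arriving before their vendor has been imported.
# orphan_references([{"vendor_no": "F001"}], "vendor_no", {"F002"})
# -> [{"vendor_no": "F001"}]
```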
Data migration in Microsoft Business Central is not simply a technical file import operation; it is an orchestrated process that requires a rigorous methodology, exhaustive traceability, and adherence to a clearly defined hierarchy of priorities. By identifying the different data types (master, subsidiary, secondary, and dynamic), scrupulously following the import order, and recording each file version, each processing operation, and each performance indicator, project teams gain visibility and control. The use of a structured folder hierarchy, a configuration worksheet, and a standardized Excel template ensures that every step, from file reception to final consolidation, remains reproducible and auditable.
By applying this method, partners and clients can reduce the risk of errors, avoid integrity breaches, and ensure a smooth transition to the new ERP system, while maintaining the operational continuity essential for the business.