Edition 3 - Mastering Data for Better Decisions Part 2

The Infor Insider
Your guide to enterprise technology, manufacturing insights, exclusive updates, and expertise for business leaders and market influencers. In this issue we continue with Part 2 of a four-part series on the most important four-letter word in digital transformation: DATA…
In this Edition
Success Spotlights - Showcasing how Infor helps our clients deliver results
🏭 ERP Insight: ERP Success Hinges on Platform-Wide Integration
A modern ERP is only as good as its operational backbone. Are your clients leveraging platform-wide integration for true digital transformation? Simply put, RPA and AI can take inefficiencies out of processes.
Precision & Performance at Ring Container - Watch now
Ring Container is enhancing operations with Infor OS. See how their digital transformation is improving precision and productivity.
Industry Insights - Trends shaping the future of business and technology
📊 Data Series (4 parts): Data Migration is the Achilles’ Heel of ERP Projects
Many companies underestimate the time and effort required to clean, standardize, and govern their data. Are your clients treating data as a strategic asset, or just an IT headache?
We are continuing our first series, centered on a formidable component of transformation that often goes unaddressed: DATA.

Mastering Data for Better Decisions, Part 2 of 4. To go back to the archive and read Part 1, click here.
This week we wanted to dive deep: not too deep, but deep enough to provide some insights without glossing over important topics. Enjoy the article below, written by our own Brad Symaka, Director of Data Solutions.
The Dirty Work: Identifying, Cleaning, and Migrating Your Data
In today's data-driven business landscape, the quality of your enterprise data directly impacts operational efficiency, decision-making capabilities, and ultimately, your bottom line. Yet, data migration and cleansing remain among the most challenging aspects of any ERP implementation or digital transformation initiative. As organizations grow and evolve, their data often becomes fragmented, duplicated, and inconsistent across multiple systems, creating significant obstacles to achieving a single source of truth.
The Hidden Challenges of Data Management
Many organizations underestimate the complexity of data migration. What appears straightforward on the surface—moving data from one system to another—quickly reveals itself as a multifaceted challenge requiring specialized expertise and tooling.
Unit of measure is one of the biggest challenges customers face in a project like this. When merging data from multiple ERP systems, seemingly simple elements like units of measure can create significant complications. Is it dozens or cases? Pounds or kilograms? And how do these conversions impact inventory valuation and operational processes?
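To make the unit-of-measure problem concrete, here is a minimal Python sketch of one common approach: convert every quantity to an item-specific base unit before records are merged, and flag anything that has no conversion rule. The item IDs, units, and conversion factors below are illustrative assumptions, not Infor functionality or real master data.

```python
# Minimal sketch: normalize quantities to a base unit of measure before merging.
# Item IDs, units, and factors are illustrative; real factors are item-specific
# (a "case" of one SKU rarely holds the same count as a "case" of another).

UOM_FACTORS = {
    # (item_id, source_uom) -> (base_uom, multiplier to reach the base unit)
    ("ITEM-100", "CASE"):  ("EA", 12),
    ("ITEM-100", "DOZEN"): ("EA", 12),
    ("ITEM-200", "LB"):    ("KG", 0.45359237),
}

def to_base_uom(item_id: str, qty: float, uom: str) -> tuple[float, str]:
    """Convert a quantity to the item's base unit, or raise if no rule exists."""
    key = (item_id, uom)
    if key not in UOM_FACTORS:
        raise ValueError(f"No conversion rule for {item_id} in {uom}; flag for review")
    base_uom, factor = UOM_FACTORS[key]
    return qty * factor, base_uom

print(to_base_uom("ITEM-100", 5, "CASE"))  # (60, 'EA')
print(to_base_uom("ITEM-200", 10, "LB"))   # (4.5359237, 'KG')
```

The design point is that conversion factors belong to the item, not to the unit name alone, and anything without a rule should be flagged rather than guessed, because a bad factor silently corrupts inventory valuation downstream.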
Another critical challenge is deduplication. "Whose information wins?" becomes a crucial question when consolidating records. The answer isn't always straightforward—sometimes one division might have better data for certain vendors, while another division maintains more accurate information for others. This complexity requires sophisticated rules and governance to ensure the right data survives the migration process.
The Journey to Clean, Consolidated Data
Successful data management follows a structured approach with these essential steps:
1. Identifying Your Data Sources – Where's the Good, the Bad, and the Ugly?
Before you can clean your data, you must understand what you're working with. The first step is creating a comprehensive inventory of all your data sources, which often extend far beyond your primary systems.
Finding your data across the enterprise:
· Legacy ERP systems that may be partially documented
· Spreadsheets scattered across departments and individual computers
· Homegrown systems built over decades with limited documentation
· Tribal knowledge held by long-tenured employees but never formalized
Common pitfalls to watch for:
· The "mystery spreadsheet" problem: Critical business processes dependent on files with unknown origins or ownership
· Conflicting versions: Multiple copies of the same data with different updates
· Rogue databases: Departmental solutions created to work around system limitations
· Undocumented data transformations happening between systems
Quick wins:
· Create a data inventory that identifies all sources, their owners, and system dependencies
· Document the "data lineage" to understand how information flows between systems
· Establish a data profiling process to assess the quality and completeness of each source (see the sketch after this list)
· Identify the most critical data domains that will require the most attention
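To make the data-profiling quick win above concrete, here is a minimal sketch (using pandas) that measures how complete each key field is in each source. The source names, columns, and sample rows are placeholders for illustration; a real profile would also check formats, value ranges, and referential integrity.

```python
# Minimal data-profiling sketch: measure completeness of key fields per source.
# The sources below are tiny in-memory examples; in practice each DataFrame
# would come from a real extract (CSV export, database query, spreadsheet).
import pandas as pd

sources = {
    "legacy_erp_vendors": pd.DataFrame({
        "vendor_id":   ["V001", "V002", "V003"],
        "vendor_name": ["Acme Industrial", "Baker Supply", None],
        "tax_id":      ["98-1234567", None, None],
    }),
    "ap_spreadsheet": pd.DataFrame({
        "vendor_id":     ["V001", None],
        "vendor_name":   ["ACME Industrial", "Baker Supply Co."],
        "payment_terms": ["NET30", "NET45"],
    }),
}
KEY_FIELDS = ["vendor_id", "vendor_name", "tax_id", "payment_terms"]

for name, df in sources.items():
    print(f"\n{name}: {len(df)} rows")
    for field in KEY_FIELDS:
        if field not in df.columns:
            print(f"  {field:<14} missing from this source entirely")
            continue
        filled = df[field].notna().mean() * 100
        print(f"  {field:<14} {filled:5.1f}% populated, "
              f"{df[field].nunique()} distinct values")
```

Even a rough report like this surfaces the good, the bad, and the ugly quickly: which sources are missing whole fields, and which fields are too sparse to trust.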
2. Mapping and Extracting – Making Sense of the Chaos
Once you've identified your data sources, the next challenge is understanding how to extract and map that data in a structured way that preserves relationships and business logic.
The key to structured extraction:
· Understanding how data moves and relates across systems
· Documenting source-to-target mappings for each data element
· Identifying transformation rules needed to standardize data
· Recognizing dependencies between different data domains (e.g., items, customers, vendors)
Common challenges:
· Missing fields: Critical data elements that exist in the target system but not in source systems
· Inconsistent formats: The same data represented differently across systems (dates, addresses, product codes)
· Incompatible structures: Hierarchical data in one system that needs to be flattened or vice versa
· Hidden business rules: Undocumented logic that transforms data as it moves between systems
Practical tips:
· Leverage ETL (Extract, Transform, Load) tools to automate and document the extraction process
· Work closely with Subject Matter Experts (SMEs) who understand the business context of the data
· Sanity-check assumptions by validating mapping rules with actual data samples
· Build a comprehensive data dictionary that standardizes definitions across systems
· Create a mapping document that serves as the blueprint for your migration (a minimal sketch follows this list)
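One way to keep that mapping document from drifting out of date is to express it as executable rules: each entry names a source field, a target field, and an optional transformation. The sketch below is a hypothetical illustration; the field names and rules are not an Infor schema or any specific customer's mapping.

```python
# Minimal source-to-target mapping sketch: each rule names a source field,
# a target field, and an optional transformation. Field names are hypothetical.
from datetime import datetime

def iso_date(value: str) -> str:
    """Normalize a MM/DD/YYYY source date to ISO 8601 (YYYY-MM-DD)."""
    return datetime.strptime(value, "%m/%d/%Y").strftime("%Y-%m-%d")

MAPPING = [
    # (source_field, target_field, transform or None for a straight copy)
    ("CUSTNO",    "customer_id",   lambda v: v.strip().upper()),
    ("CUSTNAME",  "customer_name", str.strip),
    ("OPENDATE",  "created_date",  iso_date),
    ("TERRITORY", "region_code",   None),
]

def map_record(source_row: dict) -> dict:
    """Apply the mapping rules to a single source record."""
    target = {}
    for src, tgt, transform in MAPPING:
        value = source_row.get(src)
        target[tgt] = transform(value) if (transform and value is not None) else value
    return target

legacy_row = {"CUSTNO": " c0042 ", "CUSTNAME": "Acme Corp ",
              "OPENDATE": "03/15/2019", "TERRITORY": "NA-EAST"}
print(map_record(legacy_row))
# {'customer_id': 'C0042', 'customer_name': 'Acme Corp',
#  'created_date': '2019-03-15', 'region_code': 'NA-EAST'}
```

Because the mapping is data rather than buried logic, it can be reviewed with SMEs line by line and rerun unchanged on every migration iteration.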
3. Cleansing, Deduping, and Mastering – The Quest for the Golden Record
This phase represents the heart of data transformation—where raw, inconsistent data becomes a valuable business asset. It's also typically the most time-consuming and challenging part of any data migration project.
The art (and science) of merging duplicate records:
· Identifying duplicate customers, vendors, and SKUs across multiple systems
· Determining which attributes from each duplicate should survive in the golden record
· Preserving historical transactions and relationships when consolidating records
· Handling special cases where what appears to be a duplicate actually isn't
Rules for defining a "single source of truth":
· Establishing clear criteria for what constitutes a match (exact match vs. fuzzy matching)
· Creating a hierarchy of systems to determine which source "wins" when conflicts arise (see the sketch after this list)
· Developing business rules for special cases and exceptions
· Building consensus among stakeholders on how to handle controversial decisions
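To illustrate how matching criteria and a source hierarchy work together, here is a hedged sketch: vendor names are compared with a simple similarity score, and when two records match, each field of the golden record is taken from the highest-ranked source that has a value. The 0.85 threshold, the system ranking, and the record shapes are assumptions for illustration, not a recommended configuration.

```python
# Minimal match-and-survivorship sketch: fuzzy-match vendor names, then build
# a golden record field by field, letting the higher-ranked source win.
# The threshold and the SOURCE_RANK hierarchy are illustrative assumptions.
from difflib import SequenceMatcher

SOURCE_RANK = {"erp_division_a": 1, "erp_division_b": 2, "ap_spreadsheet": 3}

def is_match(name_a: str, name_b: str, threshold: float = 0.85) -> bool:
    """Fuzzy match on normalized names; an exact match always passes."""
    a, b = name_a.strip().lower(), name_b.strip().lower()
    return a == b or SequenceMatcher(None, a, b).ratio() >= threshold

def golden_record(duplicates: list[dict]) -> dict:
    """For each field, keep the first non-empty value from the highest-ranked source."""
    ordered = sorted(duplicates, key=lambda r: SOURCE_RANK[r["source"]])
    merged = {}
    for record in ordered:
        for field, value in record.items():
            if field != "source" and value and field not in merged:
                merged[field] = value
    return merged

a = {"source": "erp_division_a", "name": "Acme Industrial", "tax_id": "", "terms": "NET30"}
b = {"source": "erp_division_b", "name": "ACME Industrial ", "tax_id": "98-1234567", "terms": "NET45"}
if is_match(a["name"], b["name"]):
    print(golden_record([a, b]))
# {'name': 'Acme Industrial', 'tax_id': '98-1234567', 'terms': 'NET30'}
```

Note how division B still contributes the tax ID even though division A "wins" overall; that is exactly the per-attribute survivorship described above, and it is why these rules need business sign-off rather than being left as a purely technical decision.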
Why this step takes longer than anyone expects:
· The volume of exceptions that require manual review is typically underestimated
· Business rules often need refinement as edge cases are discovered
· Stakeholders may disagree on what constitutes the "right" data
· The impact of decisions on downstream processes isn't always immediately apparent
How to make it faster:
· Invest in tools that can automate matching and merging based on configurable rules
· Establish a dedicated team with clear decision-making authority
· Create a tiered approach that focuses first on high-impact, high-volume data domains
· Develop a workflow for efficiently handling exceptions that require manual review
· Set realistic expectations about the time and effort required
4. Executing and Automating the Migration Process
With your data identified, mapped, and cleansed, you're ready to execute the migration. However, this is far from a simple "push button" operation.
Why migration isn't a "one and done" event:
· The target system configuration often evolves during implementation
· Business requirements may change, affecting data mapping rules
· Data quality issues continue to be discovered throughout the process
· Testing reveals gaps and inconsistencies that require refinement
· Multiple iterations are needed to ensure accuracy and completeness
Setting up a test migration environment:
· Create sandboxes that mirror the production environment
· Conduct trial runs with representative data samples
· Establish validation procedures to catch errors early
· Develop a feedback loop to refine the migration process
· Involve business users in validating the migrated data
Automating the process for efficiency and accuracy:
· Develop scripts and workflows that can be repeated consistently (a minimal sketch follows this list)
· Build in validation checks to flag potential issues
· Create detailed logs to track what was migrated and any exceptions
· Implement error-handling routines to manage failures gracefully
· Establish a version control system for migration scripts and configurations
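As a rough illustration of what a repeatable, automated run can look like, the sketch below validates each record, logs rejections and failures, and summarizes the run at the end. The field checks and the load_target placeholder are hypothetical; the real load step would call whatever API or staging process your target system provides.

```python
# Minimal migration-run sketch: a repeatable loop with validation checks,
# logging, and graceful error handling. load_target() is a placeholder for
# the real load step (API call, staging-table insert, migration tool, etc.).
import logging

logging.basicConfig(level=logging.INFO,
                    format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("migration")

def validate(row: dict) -> list[str]:
    """Return a list of problems with one record (an empty list means it passes)."""
    problems = []
    if not row.get("customer_id"):
        problems.append("missing customer_id")
    if row.get("credit_limit", 0) < 0:
        problems.append("negative credit_limit")
    return problems

def load_target(row: dict) -> None:
    """Placeholder: insert the record into the target system here."""
    pass

def run_migration(rows: list[dict]) -> None:
    loaded, rejected, failed = 0, 0, 0
    for row in rows:
        problems = validate(row)
        if problems:
            rejected += 1
            log.warning("Rejected %s: %s", row.get("customer_id"), problems)
            continue
        try:
            load_target(row)
            loaded += 1
        except Exception:
            failed += 1
            log.exception("Load failed for %s", row.get("customer_id"))
    log.info("Run complete: %d loaded, %d rejected, %d failed",
             loaded, rejected, failed)

# Illustrative extract; in practice this is the mapped, cleansed data set.
run_migration([
    {"customer_id": "C0042", "credit_limit": 50000},
    {"customer_id": "",      "credit_limit": 10000},  # rejected by validation
    {"customer_id": "C0077", "credit_limit": -250},   # rejected by validation
])
```

Because every run produces the same kind of log, each iteration leaves an audit trail you can compare against the previous one, which is what makes a multi-pass migration manageable.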
Avoiding soul-crushing manual work:
· Identify repetitive tasks that can be automated
· Create templates and tools for common migration scenarios
· Develop a library of reusable components for future migrations
· Establish clear handoffs between technical and business teams
· Build in time for troubleshooting and refinement
Wrap-Up: Making Your Data Migration Smooth (and Less Painful)
Key Takeaways
Avoid common pitfalls:
· Don't underestimate the complexity and time required for data migration
· Involve business stakeholders early and often in the process
· Document your decisions and the rationale behind them
· Plan for multiple iterations and refinements
· Build in contingency time for unexpected challenges
Don't skimp on cleansing:
· Invest the time and resources to properly cleanse and master your data
· Address data quality issues at the source whenever possible
· Establish clear rules for handling duplicates and conflicts
· Document your data quality standards and enforce them consistently
· Remember that poor data quality will undermine even the best-designed system
Test, test, test:
· Conduct thorough testing at each stage of the migration process
· Involve business users in validating the migrated data
· Test with representative data samples that cover edge cases
· Verify not just that data was migrated, but that it works correctly in business processes
· Build in time for fixing issues discovered during testing
A Sneak Peek at Post-Migration Data Governance
The work doesn't end when your migration is complete. In fact, maintaining data quality is an ongoing process that requires:
· Establishing data stewardship roles with clear responsibilities for maintaining data quality
· Implementing data quality monitoring to catch issues before they proliferate
· Creating workflows for data changes that include appropriate approvals and validations
· Training users on proper data entry and maintenance procedures
· Regularly auditing data quality and addressing issues promptly
Without proper governance, your clean, well-structured data will gradually degrade, undermining the benefits of your migration effort. In our next installment, we'll explore how to establish effective data governance to keep your ERP system pristine and your data reliable for years to come.
What’s Your Experience?
Every organization faces unique challenges when it comes to data migration. We'd love to hear about your experiences:
· What's the biggest data challenge you've faced in an ERP migration?
· How did you handle duplicate records across multiple systems?
· What tools or techniques have you found most effective for data cleansing?
· How have you maintained data quality after migration?
Share your stories, questions, and insights in the comments below. Your experience might just help a fellow professional avoid a costly mistake or discover a more efficient approach to their data challenges.
Conclusion
Data migration may be dirty work, but it's essential work. By investing in proper data identification, mapping, cleansing, automation, and governance, organizations can transform their data from a liability into a strategic asset. The journey requires commitment, expertise, and the right tools, but the destination—a single source of truth that enables confident decision-making and operational excellence—is well worth the effort.
As businesses continue to generate and consume ever-increasing volumes of data, those that master the art and science of data management will gain a significant competitive advantage. The question isn't whether you can afford to invest in data quality—it's whether you can afford not to.
A little more about the Author:
Brad Symaka, Director of Data Solutions | Infor
Brad brings over 20 years of experience in the industry, with over nine years at Infor. Brad has taken on various leadership roles throughout his career, which have helped him develop deep industry knowledge and a track record of successful customer outcomes. His customer-centric approach and ability to partner with global organizations have driven significant achievements. Brad is passionate about helping customers exceed their goals by leveraging innovative solutions and providing trusted, strategic advice. He has played a pivotal role in transforming data management strategies for major clients, leading projects that have significantly increased operational efficiency. Outside of work, Brad enjoys an active lifestyle, engaging in sports like hockey, tennis, cycling, and skiing. Brad resides in Calgary, AB, with his family.
Tune in next week for Part 3: Governance, Where Data Meets Discipline
The Road Ahead - What’s new and notable both at Infor and throughout the ERP world
⭐ Newsflash: Infor Velocity Suite is the most approachable and pragmatic way to adopt AI and other advanced technologies (yes, not everything is about AI).
Be among the first to learn about Infor Velocity Suite and how it eliminates barriers to process innovation and advanced technology adoption. Join us April 10th at 10 a.m. EDT or April 11th at 1:00 p.m. AEST.
Don’t miss your spot: register for this webinar HERE
The Functionality Factor - Features driving industry excellence
🔍💰 ERP Insight: CPQ Aligns Sales, Supply Chain, and Finance
CPQ isn’t just for sales—it impacts supply chain and finance. Are your clients using CPQ as a strategic tool or just a sales automation add-on?

The biggest reason we see clients adopting a CPQ solution is to give their sales teams, partners, or dealers a faster, more accurate, and differentiated quoting experience.
With a best-in-class CPQ solution, you can reduce the time it takes to develop an accurate and profitable quote, ensuring you respond to demand faster than your competitors.
The result: faster quoting, more sales, and higher margins (from increased accuracy and price control).
Lastly, and most importantly, having CPQ integrated into your ERP system means one source of truth, one parts master, and one seamless way to take an order, create a line item, job order, BOM, and routing, and deliver a CTP/ATP promise date that you can keep.
Make a promise. Keep a promise. Infor CPQ.
Faster Quotes, Smarter Sales with CPQ - Learn more
Closing Thoughts - Thanks for reading our newsletter! Our goal is to keep you informed on the latest and greatest in the ERP world for manufacturing and distribution.
Be on the lookout for the next editions; we post twice a month!
Was this email forwarded to you? You can subscribe by clicking the button below: