How to Put Your Master Data Woes To Bed
In our previous post “Why You Should Lose Sleep Over Unhealthy Data,” we examined the telltale signs of unhealthy data: inconsistency, inaccessibility, and incompleteness. You may have spent sleepless nights agonizing over these challenges within your organization. The symptoms of unhealthy data often escalate into that classic project sickness: scope creep. Workarounds and side efforts to overcome data quality issues distract from the goal of the original project, blowing out your schedule and resources. In the end, poor data can undermine confidence in the results, and the underlying problem still has not been fixed.
A great place to tackle your data problem is at the root: your master data, such as locations, people, and especially your products. Break the daunting job of cleansing and completing your data into smaller, achievable steps. Here are five steps you can take to tackle the problem head on.
Establish a single source of the truth. First, without a system designated as the primary source of master data, you will never achieve your goal. You may find yourself in a situation where multiple systems (not including Excel) act as the master data source for different areas. In this case, your organization may need to make a technical investment to establish a single source of master data and commit to it as the system of record.
Add a data workstream to transformation initiatives. Companies with active transformation efforts typically align on the technical system requirements and associated business processes. Getting data out of soon-to-be-shuttered legacy systems turns into a massive Extract-Transform-Load (ETL) process that has no time built in for quality checks. Focus effort on updating the legacy master data records before the ETL begins. Create and assign a master data “SWAT team,” with members drawn from operations, supplier relations/procurement, and IT. Fill gaps through supplier follow-ups and internal activities.
Process ownership. If your organization has disparate systems for master data, you likely have inconsistent controls over who can make updates. Establish a responsibility matrix for all master data changes and centralize the process. Where the responsibility lies depends on your business. For example, if you procure materials, we would suggest starting with the supplier management group, as they are on the front lines with your suppliers. Similarly, if your organization manufactures or assembles finished goods, it may make sense to have your production management group own the process.
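A responsibility matrix like this can be as simple as a single lookup that every change request must pass through. The sketch below shows the idea; the domain names and owning groups are illustrative examples, not a prescription from the article.

```python
# Minimal sketch of a centralized responsibility matrix for master data
# changes. Domains and owning groups here are hypothetical examples.
RESPONSIBILITY_MATRIX = {
    "supplier": "supplier_management",
    "product": "production_management",
    "location": "operations",
}

def approver_for(domain: str) -> str:
    """Route a change request to the single group that owns its data domain."""
    try:
        return RESPONSIBILITY_MATRIX[domain]
    except KeyError:
        # No owner means no change: centralizing forces the gap to be filled.
        raise ValueError(f"No owner assigned for master data domain: {domain}")
```

The point of centralizing is the `KeyError` branch: a change with no assigned owner is rejected outright rather than slipping through an uncontrolled side channel.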
Add data quality metrics to scorecards. Supplier scorecards typically feature KPIs derived from transactional data, such as on-time delivery, order accuracy rates, and fill rates. These scorecards give solid, wide-ranging insight into operational issues; however, they typically provide no insight into the cost of poor master data quality. For instance, a data completeness metric covering all your required fields, for both internal and external consumption, will draw attention to the problem. This is especially useful in industries with a high rate of Stock Keeping Unit (SKU) turnover, such as CPG or e-commerce.
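A completeness metric of this kind can be computed as the share of required fields that are actually populated, averaged across a supplier's catalog. The sketch below assumes an illustrative set of required fields; your own list would come from your internal and external data requirements.

```python
# Sketch of a data completeness KPI for a supplier scorecard.
# The required-field list is a hypothetical example.
REQUIRED_FIELDS = ["sku", "description", "uom", "length_in", "width_in",
                   "height_in", "weight_lb", "country_of_origin"]

def completeness(record: dict) -> float:
    """Fraction of required fields populated in one item master record."""
    filled = sum(
        1 for field in REQUIRED_FIELDS
        if record.get(field) not in (None, "", "N/A")
    )
    return filled / len(REQUIRED_FIELDS)

def scorecard_metric(records: list[dict]) -> float:
    """Average completeness across a supplier's item catalog, as a percent."""
    if not records:
        return 0.0
    return 100.0 * sum(completeness(r) for r in records) / len(records)
```

Published per supplier on the scorecard, a number like “87% complete” makes the cost of missing attributes visible alongside the usual delivery KPIs.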
Downstream validation. Most master data has to be taken at face value, but product master data can be verified independently. It is therefore good practice to assign a floor leader at each of your operations facilities, whether managed in house or by a third party, to be responsible for validation upon initial receipt into the facility. Make sure standards are used for dimensional measurements and that no receipts can occur without validation. For products in cases, always validate the unit of measure (UOM) against the supplier’s data, and use a dimensioning device such as a Cubiscan to validate dimensional data. Ensuring an escalation path back to the responsible parties lets you catch data issues before they propagate downstream.
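The receipt-time check described above can be sketched as a comparison of measured values against the master record, with a tolerance on dimensions. The 5% tolerance and field names below are assumptions for illustration, not a standard from the article.

```python
# Sketch of a receipt-time validation check. Tolerance and field names
# are hypothetical; tune them to your own measurement standards.
TOLERANCE = 0.05  # allow 5% deviation on dimensional measurements

def within_tolerance(expected: float, measured: float,
                     tol: float = TOLERANCE) -> bool:
    return abs(measured - expected) <= tol * expected

def validate_receipt(master: dict, measured: dict) -> list[str]:
    """Return discrepancies to escalate; an empty list means the receipt passes."""
    issues = []
    # UOM must match exactly; a case received as an each is a hard failure.
    if measured["uom"] != master["uom"]:
        issues.append(f"UOM mismatch: master={master['uom']} "
                      f"measured={measured['uom']}")
    # Dimensions and weight only need to fall within tolerance.
    for dim in ("length_in", "width_in", "height_in", "weight_lb"):
        if not within_tolerance(master[dim], measured[dim]):
            issues.append(f"{dim} out of tolerance: master={master[dim]} "
                          f"measured={measured[dim]}")
    return issues
```

A non-empty result would block the receipt and feed the escalation path back to the owning group, so the master record gets corrected rather than worked around.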
Solving your master data problems should be a proactive activity undertaken ahead of major initiatives. Taking the five steps above will improve the consistency, completeness, and availability of your master data. Maintaining clean, readily available data will greatly reduce the time and effort required to complete major initiatives such as Six Sigma, predictive analytics, end-to-end supply chain optimization, and complexity management.
Register for our upcoming webinar on Dec 3rd with Ford on Supplier Risk Management below!
Written by Desmond Torkornoo and Tim Kachur, consultants at OPS Rules