As a supply chain professional, you use your skills, knowledge, and abilities to deliver goods to the world. What if you applied these same skills internally to help your company improve its “data supply chain” (DSC)?
Conceptually, we know a lot about supply chain management. To succeed, we use tools, algorithms, and methods to manage the inherent variability in demand and supply. Yet few organizations are well equipped to manage their DSCs.
While managing the DSC is primarily the responsibility of information technology professionals, the concepts and tools we use every day apply to the data supply chain just as they do to the physical supply chain.
Many of us are familiar with the ASCM SCOR model, and the “data supply chain” resides solidly in the “Enable” part of the process. What if we think of “data” as the product – the output we are delivering? How might the other SCOR processes apply to data? Let us briefly look at each and see what concepts apply to data.
- Plan – Everything starts with a plan. With the DSC, we need to plan for the data we must capture. Which systems will we use to source the data? How often will the data need to be refreshed? How long do we need to keep/store the data? What analysis will the data support? How will “change control” be managed? (Note: in the case of data, change control is not about Form/Fit/Function, but about integrity, accuracy, and identifying any underlying changes to the data itself.)
- Source – Physical supply chains require key capabilities like strategic sourcing, incoming quality control/management, and supplier score-carding. These concepts apply to the data supply chain as well.
- We need to make strategic sourcing decisions to determine the “System of Record” (SOR) for each data element needed. Similar data may exist in more than one system, and we need a SOR as THE single system that we use for each specific element. With data we do not want to “dual source,” because it reduces data integrity (traceability) and accuracy.
- As we would with physical product, we need to evaluate the “quality” of the data to be used in analytic processes to be sure it is at an acceptable quality level. For data, the measures of quality are well established – accuracy, coverage, completeness, integrity, consistency, precision, timeliness, accessibility.
- Finally, we need to track or scorecard the data we are obtaining from the various SORs. The quest to improve data quality runs parallel to the focus on continuous improvement for physical goods.
- Make – Data does not go through an assembly or build process, but each data element can be thought of as a completed item. That means it must be at the appropriate quality level and have the appropriate availability for any approved “customer” who needs to use the data. That said, data may need to be transformed to be consistent with other data elements – for example, standardizing all dates into mm-dd-yyyy format, or converting all currency values to dollars.
- Deliver/Return – Physical goods often need complex distribution networks to align product supply with customer demand. Often the “last mile” to the customer affects the complexity of the network, and the concentration or fragmentation of customers impacts the design. With the data supply chain, the implications of delivery surround who is authorized to obtain and use the data.
- Authorization to access data often relates to the context and nature of the data. The more sensitive the data, the higher the level of authorization needed. For example, very few people in an organization need access to individual pay rates, and authorization is often highly restricted. By comparison, access to data on total sales for the prior quarter may be approved for everyone.
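The Source steps above call for measuring data quality and scorecarding each SOR. A minimal sketch of what such a scorecard might look like is below; it checks just two of the established dimensions (completeness and timeliness), and the field names, record shape, and seven-day freshness threshold are illustrative assumptions, not a prescribed standard.

```python
from datetime import datetime, timedelta

def quality_scorecard(records, required_fields, max_age_days=7):
    """Score a batch of records from a System of Record on two of the
    standard quality dimensions: completeness (no missing required
    fields) and timeliness (refreshed within max_age_days)."""
    total = len(records)
    complete = sum(
        1 for r in records
        if all(r.get(f) not in (None, "") for f in required_fields)
    )
    cutoff = datetime.now() - timedelta(days=max_age_days)
    timely = sum(1 for r in records if r["updated_at"] >= cutoff)
    return {"completeness": complete / total, "timeliness": timely / total}

# Example batch from a hypothetical customer SOR
records = [
    {"id": 1, "name": "Acme", "updated_at": datetime.now()},
    {"id": 2, "name": "", "updated_at": datetime.now() - timedelta(days=30)},
]
print(quality_scorecard(records, required_fields=["id", "name"]))
# → {'completeness': 0.5, 'timeliness': 0.5}
```

Tracked over time per SOR, scores like these give the same trend visibility a supplier scorecard gives for physical goods.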
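The Make step mentions transforming data for consistency, such as standardizing dates into mm-dd-yyyy format and converting currency values to dollars. A small sketch of both transformations follows; the exchange rates are placeholder values for illustration only, as a real pipeline would source current rates.

```python
from datetime import datetime

# Illustrative exchange rates (assumed values, not live data)
RATES_TO_USD = {"USD": 1.0, "EUR": 1.08, "GBP": 1.27}

def standardize_date(value, source_format):
    """Reformat a date string into the agreed mm-dd-yyyy standard."""
    return datetime.strptime(value, source_format).strftime("%m-%d-%Y")

def to_dollars(amount, currency):
    """Convert a currency value to dollars via a rate lookup."""
    return round(amount * RATES_TO_USD[currency], 2)

print(standardize_date("2024-03-15", "%Y-%m-%d"))  # → 03-15-2024
print(to_dollars(100.0, "EUR"))                    # → 108.0
```

Centralizing transformations like these means every downstream “customer” of the data receives the same consistent item.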
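The Deliver/Return discussion ties access to data sensitivity: pay rates are tightly restricted while quarterly sales may be open to all. One simple way to express that idea is a tiered clearance check, sketched below; the roles, data elements, and numeric tiers are hypothetical examples, and real policies would come from an organization's data-governance program.

```python
# Hypothetical sensitivity tiers and role clearances (higher = more restricted)
SENSITIVITY = {"pay_rate": 3, "quarterly_sales": 1}
CLEARANCE = {"hr_compensation": 3, "analyst": 1}

def can_access(role, data_element):
    """A role may read a data element only if its clearance meets or
    exceeds the element's sensitivity tier."""
    return CLEARANCE.get(role, 0) >= SENSITIVITY[data_element]

print(can_access("analyst", "quarterly_sales"))   # → True
print(can_access("analyst", "pay_rate"))          # → False
print(can_access("hr_compensation", "pay_rate"))  # → True
```

Just as a distribution network routes product only to the right customers, a check like this routes data only to authorized users.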
In today’s environment our companies are dependent on data in all areas of the business – marketing, sales, service, recruiting/hiring, and more. How might we as supply chain professionals apply our skills to help improve data?
David Angelow is a consultant and educator based in Austin, Texas. He is an Adjunct Professor in the McCoy School of Business at Texas State University leading courses on IT Strategy, Analytics, Operations and Entrepreneurship.