Featured Content

Taking Control of Data Quality

Quality product data can be a real competitive advantage for companies that get it right. However, despite more anticipated disruption across industries in the year ahead, many supply chain professionals who realize the long-term benefits of data accuracy still struggle to gain support for comprehensive and effective data quality programs. – Angela Fernandez, Vice President, Retail Grocery and Foodservice, GS1 US

Many organizations are facing unprecedented pressure in today’s Amazon-dominated world. The reality is that a “garbage in, garbage out” cycle of inaccuracies plagues supply chain data, which can both create inefficiency and degrade the consumer experience. For example, one small measurement error can mean a shipment will not fit into its assigned warehouse space, causing a company to incur thousands of dollars in unnecessary costs. Likewise, an ingredient missing from a product listing can cause an adverse reaction in a consumer, and a particularly vocal consumer on social media can do long-term damage to the brand’s reputation.

The good news is that companies that want to become more proactive about data quality do not have to reinvent the wheel in 2018 to improve the condition of product data. They can look inward at current business processes and use the framework established by the GS1 US National Data Quality Program as a blueprint for success.

The program was created by a diverse range of professionals with responsibility for master data management, quality assurance, and other data-related roles who recognized the need for one common approach to improving data quality. GS1 US worked with these passionate data professionals to establish three pillars that each promote product information accuracy.

  • Data governance – By focusing on data governance to support the creation and maintenance of product data based on global standards, organizations can take one of the most important steps to setting up a culture that values data as a strategic asset. Data governance programs serve an important function within an enterprise: setting the parameters for data creation, management and usage, creating processes for resolving data issues, and enabling business users to make decisions based on high-quality data. A solid data governance program formalizes accountability for data management across the organization and ensures that the appropriate people are involved in the process.
  • Education and training protocol – Industries including grocery, retail, healthcare, and foodservice leverage global GS1 standards in their supply chains to provide a common foundation for uniquely identifying products, capturing information about them, and sharing data with other companies. Adoption of these standards and best practices can help eliminate manual processes that are susceptible to error, enable better data interoperability with other organizations, and increase speed-to-market by making data more actionable. Maintaining internal knowledge about standards and proper application of them for data quality is essential for success.
  • Attribute audit – Attributes are the characteristics used to describe products, and they can play an essential role in how organizations stay vigilant about data quality. Organizations can validate data governance processes and institutional knowledge through routine physical audits that compare an actual product to the most recent information shared about that product.
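One concrete piece of the standards knowledge the education pillar calls for is GS1’s check-digit algorithm, which validates the identifiers (GTIN-8/12/13/14) that uniquely identify products. A minimal sketch of that published calculation:

```python
def gtin_check_digit(digits: str) -> int:
    """Compute the GS1 check digit for a GTIN body (all digits
    except the final check digit), per the standard GS1 algorithm:
    weight digits 3, 1, 3, 1, ... from the rightmost data digit,
    sum, and take the value that rounds the sum up to a multiple of 10."""
    total = sum(int(d) * (3 if i % 2 == 0 else 1)
                for i, d in enumerate(reversed(digits)))
    return (10 - total % 10) % 10


# Example: the EAN-13 "4006381333931" carries check digit 1.
print(gtin_check_digit("400638133393"))  # → 1
```

Running this check on incoming product records is a cheap way to catch mistyped identifiers before they propagate through a data governance process.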
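The attribute audit described above can be sketched in code. The function below is a hypothetical illustration (the attribute names, tolerance, and data shapes are assumptions, not part of the GS1 US program): it compares measured values from a physical audit against the most recently shared catalog values and flags attributes that are missing or outside a relative tolerance.

```python
def audit_attributes(catalog: dict, measured: dict,
                     tolerance: float = 0.02) -> list:
    """Return the names of catalog attributes that fail the audit:
    either absent from the physical measurements, or deviating from
    the catalog value by more than `tolerance` (relative)."""
    discrepancies = []
    for name, recorded in catalog.items():
        actual = measured.get(name)
        if actual is None:
            # Attribute was never measured during the audit.
            discrepancies.append(name)
        elif abs(actual - recorded) > tolerance * abs(recorded):
            # Measured value falls outside the allowed tolerance band.
            discrepancies.append(name)
    return discrepancies


# Hypothetical example: the width on file is off by 7.5%.
print(audit_attributes({"height_cm": 30.0, "width_cm": 20.0},
                       {"height_cm": 30.2, "width_cm": 21.5}))
# → ['width_cm']
```

In practice the flagged attributes would feed back into the data governance process, which owns the workflow for resolving such issues.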

The Outlook

In 2018, businesses across many industries can expect continued relentless disruption, which will only expose bad product data and the inefficiencies it creates. Data quality is no longer a distant “someday” goal or a quaint idealistic dream. Now is the time for companies with a competitive eye to harness the power of their data and use 2018 as a springboard for new growth.
