SCB Magazine - 2018 Annual Resource Guide
Solving the Problem of 'Too Much Data'

Conversations with supply-chain executives about how companies can swim in the sea of big data without drowning in it.


Businesses have embraced the era of big data with open arms. The more information about market supply and demand that’s available to them, the better they can plan their supply chains to meet changing customer needs. Or so goes the theory. The problem is that many companies are in serious danger of becoming inundated with data. They simply can’t make sense of it all, can’t separate valuable information from noise. Following are excerpts from conversations between SupplyChainBrain editors and industry experts about how supply chains are coping with the challenge of “too much data.”

Travis Lachinski, senior product manager, U.S. Bank Freight Payment: The way that shippers are solving the problem of having too much data is by focusing first on identifying what it is they’re trying to solve with the data, then finding a number of avenues by which they can tackle that problem. Traditionally, there are a few different ways that shippers can go about solving the problem. One is that they can do it manually, through the expenditure of a lot of hours and probably Excel spreadsheets, and use a lot of resources to dig out the specific data that they’re looking to get at. Two, they can buy some type of software, through their ERP [enterprise resource planning] systems or other avenues, and go at it through a technology purchase. Or three, they can use an outsourced provider, and partner with somebody who can provide them with that insight.

Adrian Kumar, Vice President Solutions Design Americas, DHL Supply Chain: You always have to draw the line. How much is too much? If you’re just constantly analyzing data and doing nothing, that’s not going to lead to anything. Artificial intelligence and big data analysis are allowing us to get smarter and smarter. They can take all this data and do things extremely quickly. The system can understand you as a consumer – what are your shopping patterns, what are your likes? It can even predict what you might need in the future. You can have too much data, but if you know what to do with it, you probably can’t have enough data.

Lachinski: The roots of the problem lie in the fact that for years, we’ve heard so much about big data. Companies are trying to find as much data as they can, and they’re incorporating it into these vast data pools. Now, they’re sitting on all kinds of data from their TMS [transportation management systems], invoices, freight payments, and providers like ourselves. They have data from all types of different sources. The problem of too much data is that they’re trying to work their way through it all, consolidating it to fit a specific business need.

Sanjiv Gupta, President and Chief Executive Officer, OpsVeda, Inc.: Data tends to be the biggest problem that companies will tell you about. It’s not the notion of being able to collect data – most companies are dealing with a deluge of it. It’s being able to make sense of the data that’s coming in from various sources. If you can do that in real time, you have significant capability for your business. If it’s all going into a so-called data lake, it might soon become a data swamp.

Carlos Nuñez, Director, Carrier Development, US Foods: It’s important to understand your business objectives, both financially and service-wise. You need to understand at the very highest level what’s important. What are the key results that you’re trying to accomplish? Then you can start to cascade that down through KPIs [key performance indicators] and metrics that are relevant, and can align with key results. By understanding what your KPIs are, you can home in on any vulnerabilities within those key metrics. It’s important to align with your business objectives, but make sure you have your entire staff and everybody rowing in the same direction in terms of those metrics.

Lachinski: A lot of the key to determining what data is valuable to you lies in making sure that it’s as accurate as possible. It’s about ensuring that you don’t end up with a pool of data that has missing data sets or isn’t reliable. Much of the data that you’re using for daily reporting can be used for your big-data needs as well.

Nuñez: I’ll tell you something that’s relevant and timely: It’s the headwinds that we’re experiencing in the marketplace today. It’s about being able to dissect and diagnose why tender acceptance might be dropping. Problems relating to truckload capacity are very real. It’s about understanding what’s contributing to the root cause, and analyzing for corrective action. Everyone’s experiencing a bit of the strain from the capacity crunch. Tender acceptance is key to a company’s success, especially for a logistics program that counts on primaries to achieve savings and service expectations on various lanes.

Lachinski: The most sophisticated technology can take the data and go so far as to weed out some of the data that’s not as good. If you have a lot of data that’s missing a few elements, it can help you to eliminate those points, so that you can achieve greater accuracy.

Nuñez: These days, there are visual analytics tools that you can leverage. The important thing is to get to a quick hit list of opportunities, and make it actionable for all stakeholders. It’s about timing, and being able to get ahead of a problem. And it’s about getting a concise, prescriptive list of opportunities that you can take action against.

Steven Carnovale, Assistant Professor of Supply Chain Management, Portland State University: When you break down analytics, there are three overarching components involved. There’s descriptive analytics. Historically, this was called reporting. It tells us what’s happening, what the present state of things is, and how we can start to figure it out. But it doesn’t tell us why. For example, you might see a rapid drop in a stock. We know that it dropped, but we don’t necessarily know why. So the next component of the analytics road map is predictive analytics. It allows us to transition away from merely looking at what has happened, and to start thinking about why it happened, and what might happen next. Predictive analytics is fueled by descriptive, in the sense that we need the first stage to tell us how we can start digging. Now we go to the third stage, and talk about prescriptive analytics. This is the culmination of the analytics toolkit. It takes information from the predictive function and tells us what we should do about it, so we can implement managerial solutions.

Chris Gordon, Vice President North America, AIMMS: There’s a lot of confusion out there. Many companies are able to gain insight from the use of descriptive analytics or visualization. But there’s no real business value in it. They’ve got to do something different to create it. So organizations have started looking much more closely at prescriptive analytics.

Lachinski: One of the ways that shippers can make use of the data is by considering augmenting it with additional data. Think about a shipper that’s sitting on a lot of data already. It may need some additional sources in order to provide an external view. Take benchmarking, for example. You can look at all of your own data to understand what you’re paying, and expect to pay. But until you use a benchmarking type of data source, you won’t understand how you’re comparing to your peers in the industry.

Nuñez: It’s important to remember that data is only as good as what you actually import into your systems, and what you’re collecting from various sources. Data integrity is always one of the risks. For example, information that we receive from a carrier might be inaccurate or incomplete. It’s about setting expectations early with providers, and ensuring that they understand what’s required. You need to incentivize them to provide quality data. Data integrity is one of the potential challenges that companies face today.

Carnovale: It’s a way of thinking about data, and how we can make better decisions scientifically and rigorously.

Lachinski: In the future, I believe, shippers are going to start getting their arms around how to use the data. Companies such as ourselves and other vendors are starting to find ways to help shippers use the data that they have, and move their businesses forward.


Resource links:
U.S. Bank
DHL Supply Chain
OpsVeda, Inc.
US Foods
Portland State University
AIMMS
