Nearly a month after the closure of the Suez Canal, waves of ships caught in the backlog are inundating ports around the world. It’s hardly the first major disruption supply chains have faced in the past year, but the crisis underscored a hard truth: disruptions aren’t going away.
Meanwhile, a global economic downturn has heightened awareness of, and reduced tolerance for, these disruptions as businesses try to protect the bottom line. The damage is adding up, and organizations are rethinking their supply-chain processes and technologies and re-examining their data sources.
Shifting Focus
Up to this point, supply-chain systems have been optimized to capture detailed data on inventory levels and movement, production output, energy efficiency and so on: all internal factors within a company’s control.
This approach is much like driving down the road at 60 miles per hour while looking only at your dashboard. Your speed is under the limit, you have plenty of gas and your car isn’t overheating. All good news, right up until you crash into the car in front of you. Why? Because you were focused only on what was happening inside your car.
In many ways, this is how we’ve managed supply chains. Our “dashboard” tells us there are no problems with the things we control, but that view ignores the risk of what’s happening externally. Businesses need to invest as much in systematic ways to assess risk and disruption from outside factors as they do in internal measures.
Improving demand forecasting and safety-stock levels will certainly help companies become more efficient, but they also need a 360-degree view that includes robust measurement of external influences on their business.
Alternative Data Sources
A growing number of manufacturers are using alternative data to anticipate supply problems stemming from these external influences. They’re checking whether a backlog of ships in a given harbor signals that an upcoming shipment is likely to be delayed. They’re looking further upstream at potential disruptions to supply, analyzing precipitation and soil-moisture levels to predict crop yields and ensure a steady flow of raw materials, from latex to cocoa, into their manufacturing processes. And they’re monitoring news from thousands of sources around the globe to detect potential strife or strikes and anticipate changes in demand.
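To make the first of those ideas concrete, here is a minimal sketch of flagging port congestion from ship-transponder reports. It assumes a hypothetical CSV of AIS-style position reports; the column names (mmsi, timestamp, nav_status, port_area), the port label and the baseline queue length are illustrative, not taken from any real feed.

```python
import pandas as pd

# Hypothetical AIS-style position reports: one row per ship per report,
# with a navigational status field ("anchored", "underway", and so on).
reports = pd.read_csv("ais_reports.csv", parse_dates=["timestamp"])

# Keep only the most recent report per ship near the port of interest.
near_port = reports[reports["port_area"] == "SUEZ_APPROACH"]
latest = near_port.sort_values("timestamp").groupby("mmsi").tail(1)

# Count ships waiting at anchor and compare against a typical queue.
waiting_now = int((latest["nav_status"] == "anchored").sum())
baseline = 40  # assumed typical queue length for this anchorage

if waiting_now > 2 * baseline:
    print(f"Congestion alert: {waiting_now} ships at anchor "
          f"(baseline ~{baseline}); expect inbound shipments to slip.")
```

In practice, the baseline would come from historical data for that anchorage rather than a fixed constant, but the shape of the analysis is the same.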
Shifting focus to these external, upstream factors means dealing with data sets that may be completely new to these companies: everything from satellite imagery to ship-transponder signals to weather forecasts for the regions where raw materials are grown. Forward-thinking manufacturers are turning to new technologies to handle these data sets, particularly data lakehouses, which let companies combine unstructured and structured sources, refresh them as often as real time, and quickly extract the relevant information. They’ve realized that legacy technologies such as data warehouses are too inflexible to deal with these new types of data and analysis. With a data lakehouse, companies can test new data sets and analyses and make decisions in hours or days.
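As one hedged illustration of that pattern, the sketch below streams semi-structured ship-position reports into an open Delta Lake table on Spark (Spark 3.1 or later for toTable), where analysts can query them alongside structured data minutes after arrival. The landing path, checkpoint location, schema and table name are assumptions for the example, not a prescribed setup.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("external-signals").getOrCreate()

# Continuously land semi-structured JSON position reports in a Delta
# table, where they sit next to structured ERP and inventory data.
(spark.readStream
      .format("json")
      .schema("mmsi LONG, timestamp TIMESTAMP, "
              "nav_status STRING, port_area STRING")
      .load("/landing/ais/")                     # illustrative path
      .withColumn("ingested_at", F.current_timestamp())
      .writeStream
      .format("delta")
      .option("checkpointLocation", "/chk/ais/")
      .toTable("supply_risk.ship_positions"))    # illustrative name

# Analysts can then ask SQL questions of the same table, e.g.:
#   SELECT port_area, count(*) AS at_anchor
#   FROM supply_risk.ship_positions
#   WHERE nav_status = 'anchored'
#   GROUP BY port_area
```

The same table serves both the streaming pipeline and ad hoc analysis, which is the flexibility the lakehouse offers over a traditional warehouse.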
Rerouting a voyage meant to pass through the Suez Canal around the Cape of Good Hope instead is expensive, adding at least $450,000 and 3,000 nautical miles to a shipment’s journey; at typical container-ship speeds, that extra distance alone means roughly a week more at sea. But as we’ve seen in recent weeks, sitting and waiting for an unforeseen obstacle to be cleared can add even more to that expense.
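That trade-off lends itself to a back-of-the-envelope comparison. The sketch below uses the article’s figures for the detour; the vessel speed and the daily cost of delay are assumed values, flagged in the comments, so the point is the shape of the comparison rather than the exact numbers.

```python
# Back-of-the-envelope: reroute around the Cape vs. wait at the canal.
reroute_extra_cost = 450_000   # USD for the detour (figure from the article)
extra_distance_nm = 3_000      # nautical miles added (figure from the article)
speed_knots = 16               # assumed typical container-ship cruising speed
daily_delay_cost = 60_000      # USD per day of delay, assumed (charter + cargo)

extra_days = extra_distance_nm / (speed_knots * 24)

# Waiting stays cheaper only while:
#   wait_days * daily_delay_cost < reroute_extra_cost
#                                  + extra_days * daily_delay_cost
break_even_days = reroute_extra_cost / daily_delay_cost + extra_days

print(f"The detour adds roughly {extra_days:.1f} days of sailing.")
print(f"Waiting beyond about {break_even_days:.1f} days makes rerouting cheaper.")
```

With better external data, each operator could run this comparison with real numbers on day one instead of guessing.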
It’s not a coincidence that the Ever Given’s sister ship rerouted around the Cape of Good Hope early; its operators likely had data suggesting the blockage would last a while. If all the backlogged ships had had that same information, and had it earlier, the industry as a whole would be weathering this storm better. A more comprehensive view of both internal and external factors, supported by alternative data, is key to handling unexpected challenges.
Rob Saker is global industry leader of retail and manufacturing at Databricks.