Fortunately, as the wave of volatility swept the nation, so too did a new wave of data. These massive flows of data - referred to as "big data" - have become a digital resource powering our physical world: a new raw material with the potential to tame uncertainty. A new breed of application, powered by these data flows and naturally adapting to its surroundings, is emerging. The blueprint for successfully unlocking its value is defined below by the three Laws of Big Data.
The First Law of Big Data: "If a machine can do it, then a machine should do it"
Automation is at the heart of any native big data application for a very simple reason - scale. Even with just retailer and manufacturer data combined, there is simply too much information for people to review daily, let alone make sense of. The human brain quickly becomes overwhelmed. In contrast, machines are particularly well suited for the mind-numbingly repetitive tasks associated with crunching volumes of data, seeking complex correlations, finding meaningful patterns and publishing daily outputs to production and execution systems.
To get a sense of just how enormous big data is for retail, consider that there are more than 200,000 retail locations in the United States. As a manufacturer, now imagine a thousand items at each location, processed 365 days a year. Factor in that retailer data is more than just point-of-sale information - it also includes store inventory, warehouse withdrawals, retailer forecasts and other fields - and you quickly enter the realm of big data.
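As a rough back-of-envelope check (using the location and item counts cited above; the calculation itself is ours), the record counts alone land firmly in big data territory:

```python
# Back-of-envelope scale estimate using the figures cited in the text:
# 200,000 US retail locations, ~1,000 items each, tracked daily.
locations = 200_000
items_per_location = 1_000
days_per_year = 365

records_per_day = locations * items_per_location
records_per_year = records_per_day * days_per_year

print(f"{records_per_day:,} records per day")    # 200,000,000 records per day
print(f"{records_per_year:,} records per year")  # 73,000,000,000 records per year
```

And that counts only one data field per item, location and day; multiplying by inventory, withdrawal and forecast fields pushes the total several times higher.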
For example, in a large consumer packaged goods company with 1 million forecast series and 70 analysts, each analyst would need to review roughly 30 forecasts every minute of an eight-hour workday. Even if it were technically feasible for a person to work at this speed, there would be no time to check whether results are meaningful, let alone add any value. Since hiring armies of new people to analyze this data is out of the question, automation is a must-have feature for any big data application.
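The arithmetic behind that claim is easy to verify; assuming an eight-hour working day (an assumption made for this illustration - the source does not state the day length), the cited figures work out to roughly 30 forecasts per analyst per minute:

```python
# Throughput check for the figures cited above.
# The eight-hour workday is an assumption made for this illustration.
forecast_series = 1_000_000
analysts = 70
minutes_per_workday = 8 * 60  # 480 minutes

per_analyst_per_day = forecast_series / analysts            # ~14,286 forecasts
per_analyst_per_minute = per_analyst_per_day / minutes_per_workday

print(round(per_analyst_per_minute))  # 30
```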
The first law has a natural extension related to human productivity. Anything a machine can do, it should do - to free people for tasks that machines cannot do. Offloading repetitive work to machines frees supply chain professionals to focus on strategic areas, such as planning promotions or redesigning the network to improve cost-efficiency. This is not just theoretical - companies that have implemented big data applications to improve forecast accuracy by proactively sensing changes in demand routinely realize improvements in demand planning performance.
The Second Law of Big Data: "If you don't have a structured, robust solution, you don't have a solution"
"Big data applications" are only transformational when used in a systematic and structured way to drive core operational activities across the enterprise. Though many companies store large quantities of retailer data in demand signal repositories, few solutions use this data for more than account-level business intelligence queries, such as trade promotion or retail compliance analysis. While useful, these applications are far from transformational. In contrast, big data applications used by leaders in manufacturing are truly enterprise-wide - encompassing most if not all items and locations - and perform a structured, daily analysis of demand signals to power core activities such as short-term forecasting, replenishment, inventory management, transportation planning and supplier visibility.
Since daily human review of results is essentially impossible, these applications must be robust. Sudden input swings to production systems for even a small number of items risk causing disruptive shocks, so big data applications are designed with layers of safeguards to ensure results are always consistent and meaningful. These safeguards are closely held by software vendors as core intellectual property, because the second law is instrumental to making use of big data. And they work. Some of the world's largest consumer packaged goods companies with the most respected supply chains already rely on big data applications to create and publish daily forecasts directly to production systems for the majority of their business.
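The source keeps the actual safeguards proprietary, so the following is only an illustrative sketch of what one such layer could look like: a clamp that caps the day-over-day swing a new forecast can introduce before it reaches production. The function name and 25 percent threshold are invented for this example.

```python
# Illustrative safeguard (hypothetical): limit the day-over-day change a
# published forecast can introduce, so a bad input cannot shock production.
def guard_forecast(new_forecast: float, previous: float,
                   max_swing: float = 0.25) -> float:
    """Clamp new_forecast to within +/- max_swing of the previous value."""
    lower = previous * (1 - max_swing)
    upper = previous * (1 + max_swing)
    return min(max(new_forecast, lower), upper)

print(guard_forecast(200.0, 100.0))  # 125.0 - a +100% spike capped at +25%
print(guard_forecast(110.0, 100.0))  # 110.0 - within tolerance, passed through
```

A real system would layer many such checks (outlier detection, plausibility ranges, fallback forecasts); this only shows the principle of guarding production inputs.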
The Third Law of Big Data: "If you don't expect dramatic improvements, don't bother"
Wading into the big data arena is a big step, so if you can achieve the same results with traditional methods, then do so. Since 1965, Moore's law has been tirelessly driving the availability of computing power, so the concept of running more sophisticated models is nothing new. The game changer comes from harnessing the recent explosion of data. Internet-scale data sets have become a digital raw material to facilitate change. For the retail industry, these data sets contain a wealth of information about the current state of the supply chain. The challenge comes from their sheer scale - valuable information is lost in a sea of noise. Simply super-sizing existing applications and computing platforms just results in more complex systems rooted in old paradigms. There is a reason Yahoo didn't create Facebook - it was too focused on what was instead of what could be. What's required is a new generation of native applications built from the ground up with big data at the center. For all the talk of big data by the traditional ERP vendors, look for real innovation to come from emerging companies that are not beholden to legacy systems and ingrained paradigms.
With all of this data, applications are able to do new and wonderful things with a granularity that was previously beyond our reach. For example, instead of using segmentation to apply rules of thumb across a group of items, actual settings for individual items at each location can now be determined. Better yet, these are now based on current and relevant information from the supply chain, not historical averages. The result can be a step-change reduction in forecast error of 40 percent or more. This is the kind of game changing impact that comes with the promise of big data.
Leaders are already looking further for yet another step change in performance. By connecting the supply chains of manufacturers and their trading partners, big data applications are starting to extend planning activities to include detailed modeling at retailer facilities for next-generation vendor-managed inventory capabilities. The payoff is improved on-shelf availability and a net reduction of inventory across the supply chain in a way that was not possible just a year ago. It is a great example of big data enabling collaboration that works.
Big Data Now - Agility to Thrive in Volatile Markets
Fortunately for the retail industry, big data applications are being widely deployed by some of the world's largest manufacturers. In North America alone, big data applications are used to plan approximately one third of all consumer packaged goods trade, representing over $100bn of annual revenue. Following the First Law, these companies use automated demand sensing applications to process masses of supply chain data every day and publish daily forecasts directly to production systems - all without human review. This structured and robust approach gives them accurate, current and consistent forecasts (Second Law) that reflect current supply chain realities and allow them to make better production and deployment decisions. By using big data, manufacturers achieve a step change in performance across every part of their business, from promotions to new products, from the fast-moving 2 percent of items to the slower-moving 85 percent, for a solid return on investment (Third Law).
Furthermore, the systematic and automated use of daily data gives these companies an unprecedented opportunity to sense and quickly react to supply chain disruptions. In a global survey by the Business Continuity Institute of 550 companies in 60 countries, 85 percent reported at least one significant disruption to their supply chain, with more than 50 percent of disruptions related to adverse weather. How companies plan for and react to disruptions has direct financial implications and has become an increasingly important part of corporate strategy.
Tissue demand, for example, is particularly susceptible to the severity of the winter and flu season. With this year's unusually warm winter, companies building to historical demand naturally overproduced. Those who used big data to sense demand, however, mitigated this risk: the automated daily analysis of retailer data identified the change in demand, and companies were able to scale back production accordingly. Despite the unexpectedly warm winter, one tissue manufacturer reported that by sensing demand they achieved the lowest inventories and the highest service levels in their history.
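As a toy illustration only (the production demand sensing algorithms are proprietary; the function, data and 15 percent threshold here are invented), detecting such a shift can be as simple as comparing a recent daily-sales average against a historical baseline:

```python
# Toy demand-sensing check: flag a material deviation of the recent daily
# point-of-sale average from a historical baseline. All numbers invented.
def demand_shift(recent_daily_sales, baseline_daily_avg, threshold=0.15):
    """Return the fractional deviation if it exceeds threshold, else None."""
    recent_avg = sum(recent_daily_sales) / len(recent_daily_sales)
    deviation = (recent_avg - baseline_daily_avg) / baseline_daily_avg
    return deviation if abs(deviation) > threshold else None

# A warm winter: tissue sales running well below the historical norm.
print(demand_shift([70, 68, 75, 72, 65], baseline_daily_avg=100))  # -0.3
print(demand_shift([98, 102, 100], baseline_daily_avg=100))        # None
```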
The same holds true for growth opportunities during particularly severe winters and flu seasons. During the H1N1 pandemic of 2009, a manufacturer indirectly tracked the spread of the virus faster than the Centers for Disease Control and Prevention by looking at changes in daily tissue forecasts. Not surprisingly, consumers reached for tissues at the first sign of a runny nose, well in advance of full-blown symptoms and doctor visits. While the use of tissue forecasts as an outbreak precursor is fascinating, the real story is that the manufacturer was able to identify the change in demand and respond by shifting production and deployment. While competitors experienced stock-outs, their products were on-shelf, capturing an unexpected lift in revenue and building brand loyalty.
Likewise, when Hurricane Irene hit the Northeast and Mid-Atlantic states last year, every manufacturer supplying this market was impacted as consumers hoarded goods and broke their normal routines. However, companies that sensed demand had a distinct competitive advantage. A producer of fast-moving family care products reported that by systematically processing daily retailer data, they were able to tell when consumers stopped buying water and generators and began to settle back into normal routines. This allowed them to dynamically adjust deployment plans to ensure products were in place as soon as demand returned.
Solutions to Help Us Win
Using big data to create an agile supply chain that thrives in volatility contributes to the balance sheet, cash flow and income statement. Despite market volatility, companies that sense and quickly respond to changes in demand can confidently cut inventory based on real information, reducing working capital requirements and freeing cash. Consumer packaged goods companies often see a reduction of around five days of inventory, which for large manufacturers can represent hundreds of millions of dollars. Better visibility into future demand also lowers operating costs: getting products in the right place the first time means fewer transshipments and expedited freight shipments. Improved service levels translate into fewer lost sales and higher revenues, which can be especially important in mature, lower-growth markets such as North America and Europe. One large consumer packaged goods company stated that it was missing out on a billion dollars every year through stock-outs. Closing this gap is even more important as we leave the recession behind us.
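To see how five days of inventory can translate into hundreds of millions of dollars, consider a worked example (the cost-of-goods figure below is hypothetical, chosen only to show the arithmetic):

```python
# Hypothetical worked example: cash freed by cutting five days of inventory.
annual_cogs = 15_000_000_000  # assumed annual cost of goods sold
daily_cogs = annual_cogs / 365
days_of_inventory_cut = 5

cash_freed = daily_cogs * days_of_inventory_cut
print(f"${cash_freed / 1e6:,.0f}M in working capital freed")  # $205M
```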
All of these benefits are made possible by the three Laws of Big Data. As we stand at the edge of a new era and look forward, there is a new world of possibilities. Big data applications will shape our lives in ways that are hard to predict at this point. But one thing is certain: the solutions that win in this era of big data - an era that will undoubtedly transform business and society - will all adhere to the three Laws.
Source: Terra Technology