Oh, the thankless job of forecasting! When you are right, you go virtually unnoticed. But when you are wrong, all eyes are on you, waiting to hear why the forecast is wrong again. Over time, growing more accustomed to this, you learn to perfect with greater precision not your forecasting prowess, but your skill in producing quality excuses. With practice, a straight face and elegant replies soon come with ease. The developmental phases include: (1) the quick-to-blame-others technique, like blaming sales for not selling enough, customers for not buying enough, operations for not building enough, or HR for not paying you enough to do a better job; (2) blaming things out of your control--"it was too sunny, too rainy, too cold, or somehow global warming had a part in it"; (3) the technical approach--"but my R-square is good"; (4) the obvious-response approach--"well, the customer bought more than they were supposed to"; or finally, (5) the even more obvious and my personal favorite--"you have to understand that forecasting the future is much more difficult than forecasting the past."
As forecasters, we come to terms with the fact that forecasts are always wrong. We understand volatility, cycles, and other factors, and we see them as contributing to the difference between the demand we forecast and the actual demand that occurs. But is that enough? If we stop there, accepting error while doing little to understand it and improve accuracy, we do our companies and our profession a grave disservice. No matter how much we try to explain our inaccuracies, we must face the fact that forecast errors are not going anywhere. I will discuss three things we need to do with forecast inaccuracies (identify, improve, and prepare) and stress the importance of spending almost as much time analyzing error as we spend forecasting.
I remember when I was learning to drive a car, my driving instructor made a statement that all accidents are avoidable. Being younger, dumber, and inexperienced, my initial thought was to scoff. With time and wisdom (and a few accidents), I now understand that for the accidents that were my fault, I should have been paying better attention. And for the others that seemed out of my control, I should have been better prepared. So, using some of the same logic from lessons learned, I make this statement now with confidence: "All forecast errors are avoidable!" Outside of falling asleep at the wheel and not giving your forecast enough attention, there are only two kinds of inaccuracies: those caused by your driving and those caused by other drivers.
The first is by far the most dangerous but, fortunately, the most avoidable. Month after month, when you look at the results, actual sales land consistently on the same side of the forecast, either above or below it. This is one type of bias. The causes can be numerous: undetected seasonality, management targets used in lieu of unconstrained forecast demand, inappropriate weighting inside models, or unseen growth or trends, to name a few. The results are generally the same: either cash locked up in unnecessary inventory, or stockouts and customers calling with service issues.
To identify bias, examine forecast error through time. Begin analyzing your forecast error (actual minus forecast, divided by actual, for the same period) over a greater number of periods. Look at these data points much as you examine your original forecast data. Try to identify cycles, seasonality, and trends; if any of these are present, you have bias. Forecast error should be random (normal variation above and below the average error). Expressed as MPE (Mean Percentage Error), the closer to 0 the better. If you find you have bias, you are in popular company: most companies operate with some bias in their forecasts, but most of it is either undetected or (worse) known and ignored. Now that you have evaluated forecast errors more closely over more periods, you should be able to see them more clearly, both visually and analytically. You now have the opportunity to identify the sources and handle them better. That may mean demonstrating these biases to management so they understand them and decide how to work with or correct them. Or it may be a revelation in your own process that causes you to react and improve. Either way, it is critical that this form of inaccuracy is not ignored but rather identified, discussed, and dealt with.
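As a concrete sketch of this check (the demand history, forecasts, and thresholds below are illustrative assumptions, not the author's data), the MPE described above can be computed over a window of periods and compared against zero, along with a simple count of how often actuals land above the forecast:

```python
# Sketch of a bias check using Mean Percentage Error (MPE).
# Data and thresholds are illustrative assumptions, not from the article.
actuals   = [120, 135, 110, 142, 128, 150, 138, 145]   # actual demand per period
forecasts = [100, 118, 101, 125, 115, 133, 124, 130]   # forecast for the same periods

# Percentage error per period: (actual - forecast) / actual
pct_errors = [(a - f) / a for a, f in zip(actuals, forecasts)]

mpe = sum(pct_errors) / len(pct_errors)
print(f"MPE: {mpe:.1%}")   # the closer to 0%, the less bias

# A simple sign check: with unbiased errors, roughly half the periods
# should land above the forecast and half below.
over = sum(1 for e in pct_errors if e > 0)
print(f"{over} of {len(pct_errors)} periods came in above forecast")

# A persistently positive (or negative) MPE, or errors that almost always
# fall on one side, signals bias whose source should be investigated.
```

In this made-up example every period comes in above forecast and the MPE sits well above zero, which is exactly the kind of one-sided pattern the paragraph above describes.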
The second thing you need to watch out for is other drivers. Even if you reach the point where you are completely bias-free, you will still never be 100% accurate. The unfortunate fact is that life happens and there is such a thing as normal variation. Watching out for these other drivers does not mean merely accepting that they will occur; it means being prepared for them and accounting for them. The good news is that, with good forecasts and even better processes, these hopefully "near misses" will fall equally above and below your mean error. Over time this lets the errors balance out, so that forecasts closely approximate actuals and accuracy improves on larger, aggregated data.
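To illustrate why unbiased, random errors tend to wash out at higher levels of aggregation, here is a minimal simulation; the item count, error spread, and number of periods are all assumed purely for the example:

```python
# Sketch: random, unbiased item-level errors largely cancel in the aggregate.
# All quantities here are illustrative assumptions.
import random

random.seed(42)
n_items, n_periods = 50, 12
item_forecast = 100            # same forecast for every item and period, for simplicity

item_abs_pct_errors = []
aggregate_abs_pct_errors = []

for _ in range(n_periods):
    # Actuals vary randomly around the forecast with no bias.
    actuals = [item_forecast + random.gauss(0, 15) for _ in range(n_items)]

    # Average absolute percentage error at the item level
    item_abs_pct_errors.append(
        sum(abs(a - item_forecast) / a for a in actuals) / n_items
    )
    # Absolute percentage error of the total (aggregated) forecast
    total_actual = sum(actuals)
    total_forecast = item_forecast * n_items
    aggregate_abs_pct_errors.append(abs(total_actual - total_forecast) / total_actual)

print(f"Avg item-level |error|: {sum(item_abs_pct_errors) / n_periods:.1%}")
print(f"Avg aggregate |error|:  {sum(aggregate_abs_pct_errors) / n_periods:.1%}")
```

Running this, the item-level errors sit around ten percent or more while the aggregate error is only a small fraction of that, which is the "greater accuracy on larger aggregated data" the paragraph refers to.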
Looking at the error from a single period never gives an accurate framework for determining variation. Just as you would never look at a single data point and conclude that, because it occurred once, it will occur every time, you should examine error variation the same way you examined your original data. Analyzing forecast against actual demand over multiple periods (using many tools to help identify the types of error) is as important as using those techniques and tools to forecast the demand in the first place. To distinguish properly between bias and normal variation, you need multiple data points. After the normal variations are ferreted out, even though they are far less predictable, it is still critical to accommodate them. If no other lesson is learned, even though it may be difficult for forecasters to accept, we should understand that customers do not accept normal variation as an excuse. Although the ways of handling these variations are countless, the most common (after identification) affect supply. Among your options are safety stock, shortening lead times within your supply chain, and increasing manufacturing flexibility. And without neglecting your responsibilities on the demand side, even with "normal variations" it is always good to continually improve and to use the market intelligence in your own S&OP process to eliminate even more of this type of variation.
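One common way to accommodate the remaining normal variation on the supply side is to size safety stock from the spread of past forecast errors. The sketch below uses the standard textbook safety-stock formula; the error history, service level, and lead time are assumptions made up for illustration, not figures from the article:

```python
# Sketch: sizing safety stock from the variability of past forecast errors.
# Error history, lead time, and service level are illustrative assumptions.
import statistics

# Forecast errors (actual - forecast) in units, over recent periods
errors = [12, -8, 5, -15, 20, -3, 9, -11, 6, -4]

sigma_error = statistics.stdev(errors)     # standard deviation of forecast error
z = 1.65                                   # z-score for ~95% cycle service level
lead_time_periods = 2                      # replenishment lead time, in forecast periods

# Classic safety-stock formula: z * sigma * sqrt(lead time)
safety_stock = z * sigma_error * lead_time_periods ** 0.5
print(f"Suggested safety stock: {safety_stock:.0f} units")
```

The point of the sketch is simply that the same error history used to detect bias also quantifies the normal variation, and that quantity, rather than a gut feel, is what should drive buffers like safety stock.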
The time we spend on data gathering, model development, and forecasting processes should also include reviewing results, reconciling accuracy, and making improvements. Determining why we did not hit the forecast can be as important as the forecast we made in the first place. Although the adage that forecasts are always wrong may be correct, I would like us all to spend more of our effort proving the saying wrong than making excuses for why it is right.
Author: J. Eric Wilson is a forecast analyst and demand planner for TEMPUR-PEDIC INC.
http://www.ibf.org