Recent years of economic upheaval have made forecasting demand increasingly difficult for consumer products companies. Facing the largest drop in wealth in decades, consumers became more conservative and searched for ways to save money on household spending. Large purchases were postponed or eliminated, effectively altering traditional demand patterns. Widespread promotions further distorted consumer buying behavior, and the shift to store brands in some cases may change it permanently. This increase in demand volatility and forecast error taxed the supply chain at a time when manufacturers' profits were already being squeezed.
Faced with these challenges, consumer products companies are investing in new ways to predict demand within a volatile environment. Traditional demand planning systems struggle to predict demand accurately even in the best of times, and their shortcomings become acute during economic downturns. Seasonal models that rely on historical sales are, by definition, poor predictors of current demand in volatile markets. Instead, manufacturers need a means to dynamically sense changes in demand as they emerge and to feed updates automatically into supply chain management systems.
To mitigate volatility and gain insight into current demand, manufacturers are turning to point-of-sale (POS) data, only to find that using them for forecasting is more complex than first anticipated. While POS data provide a picture of recent sales, they are historical in nature, and POS data alone do not provide enough information to create a demand plan. Unfortunately, what retailers sold yesterday is not necessarily a good predictor of what they will order tomorrow. While sell-one/ship-one sounds intuitive, experience has shown that this is rarely the case; if it were, forecast error would already have been eliminated.
The Limitations of POS
There are two basic approaches to incorporating retailer data into supply chain management systems to predict manufacturer requirements. The "extended" replenishment planning model advocates forecasting at the store level using POS data and then mechanically modeling all the intermediate inventory locations, flows and policies to generate requirements. This approach is sometimes referred to as "flowcasting." There are three major challenges with this approach:
- Scalability — While starting from the retail location and working backwards sounds intuitive, the huge modeling requirements limit practical scalability. Enterprise-wide deployments encompassing all items at all stores would be prohibitively complex and expensive. For example, many consumer products are stocked at tens of thousands of locations; Wal-Mart alone has more than 3,000 stores in the U.S. Managing forecasts, safety stock, lead times and so on at this level is far more difficult than managing the same data for 10 manufacturer warehouses. It also requires knowing your customers' inventory policies, lead times and ordering strategies.
- Conflicting information — It is entirely possible to generate a requirement of 500 cases for today, but there is no guarantee that is what the retailer will order; they may well order 300 cases, or 700. It is unclear how to reconcile these conflicting data without additional rules and assumptions such as forecast consumption. A better choice is to model the retailer's actual behavior rather than making assumptions.
- Accuracy — Generating accurate forecasts at the store level is challenging, and counting on all the intermediate actions to happen as modeled is optimistic at best.
This approach could be useful for predicting the next shipment, but it is less likely to generate an accurate forecast across the entire product supply lead time.
Harness Downstream Data
The second approach is to analyze retailer and consumer behavior and data to predict what is most likely to happen, rather than trying to model the extended upstream supply chain from the store level up. (The Multi-Enterprise Demand Sensing, or MDS, engine from my company, Terra Technology, follows this approach, for example.) Using this approach, a company analyzes the full range of demand signals and inventory information from every supply chain echelon to create a single, more accurate forecast. Demand-sensing software applies predictive analytics to reconcile these signals into a single operational demand prediction. Unlike traditional time series-based statistical models (such as triple exponential smoothing), which in effect project the average of historical sales for this time of year, the demand-sensing approach focuses on what is actually occurring now within the manufacturer and retailer networks. The net result, we would assert, is more accurate forecasts, especially in volatile markets.
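For readers unfamiliar with the traditional baseline the article contrasts against, here is a minimal sketch of additive triple exponential smoothing (Holt-Winters). It illustrates why such models effectively project average historical sales for this time of year: every forecast is assembled from smoothed level, trend and seasonal components of past data, with no input from current downstream signals. The function name and parameter defaults are mine, for illustration only.

```python
def holt_winters_additive(series, m, alpha=0.3, beta=0.1, gamma=0.2, horizon=4):
    """Additive triple exponential smoothing (Holt-Winters).

    series: historical demand, at least two full seasons long.
    m: season length (e.g. 52 for weekly data with an annual cycle).
    Returns a list of `horizon` forecasts.
    """
    # Initial level: average of the first season.
    level = sum(series[:m]) / m
    # Initial trend: average season-over-season change per period.
    trend = sum((series[m + i] - series[i]) / m for i in range(m)) / m
    # Initial seasonal indices: each period's deviation from the seasonal mean.
    seasonal = [series[i] - level for i in range(m)]

    # Smooth level, trend and seasonality through the rest of the history.
    for t in range(m, len(series)):
        y = series[t]
        s = seasonal[t % m]
        last_level = level
        level = alpha * (y - s) + (1 - alpha) * (level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
        seasonal[t % m] = gamma * (y - level) + (1 - gamma) * s

    # Forecasts extrapolate the smoothed components only -- no current signal.
    n = len(series)
    return [level + (h + 1) * trend + seasonal[(n + h) % m]
            for h in range(horizon)]
```

On a perfectly repeating history the model reproduces last year's pattern exactly, which is precisely the behavior that breaks down when demand shifts abruptly.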
It may seem obvious that retailers and consumers have different ordering and purchasing patterns for different types of products, but it is not something that planners can easily discern. For example, fast-moving consumer goods like paper products tend to have many substitutes, while lower-volume items, such as cosmetics, have more loyal customers. And while sell-one/ship-one sounds intuitive, the useful demand signals vary considerably between items, so applying rules of thumb and mechanically processing the data will lead to inferior answers. As one of our customers points out, processing this quantity of information surpasses the abilities of the human brain; manufacturers should therefore let the software sort out the predictors. This has the added benefit of freeing demand planners to focus on more value-added work, such as promotional adjustments, new items or distribution.
Using downstream data to manage volatile demand is most effective when demand signals are received every day, not weekly or monthly. A manufacturer using daily data sees changes with roughly one day of demand latency, giving it 11 more days to react than a competitor working with a 12-day latency. Traditionally, demand planners spend a lot of time reviewing forecasts, adjusting history, tweaking modeling parameters and analyzing results before information is published to production. Such manual intervention is no longer feasible if daily information is to be processed for every item-location combination; instead, results need to be published directly to the production planning system each day. This requires that any enterprise-wide downstream data application, such as my company's MDS application, be fully automated, be designed specifically for daily granularity and publish results directly to production with minimal human intervention. Solutions built this way can unlock the potential of downstream data in mainstream planning and execution systems.
Benefits of Demand Sensing
At the Supply Chain and Logistics Summit in Prague (June 2010), one of the largest consumer products companies described how it relied on demand sensing and downstream data to respond to a new era of consumer demand volatility. A better understanding of demand was central to its success: MDS delivered a 60 percent reduction in forecast error, from 50 percent with traditional demand planning to 20 percent with MDS. By harnessing demand sensing and downstream data, the company improved service with end-to-end demand visibility, reacted to changes in near-term demand through an agile operating strategy and improved retailer collaboration to increase on-shelf availability and reduce working capital.
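A note on the arithmetic: the 60 percent figure is a relative reduction, since cutting forecast error from 50 percent to 20 percent removes (50 - 20) / 50 = 60 percent of the error. A minimal sketch of that calculation, using a simple MAPE-style error measure (the function names are mine; practitioners often use volume-weighted variants):

```python
def mape(actual, forecast):
    """Mean absolute percentage error over periods with nonzero actuals."""
    pairs = [(a, f) for a, f in zip(actual, forecast) if a != 0]
    return sum(abs(a - f) / a for a, f in pairs) / len(pairs)

def relative_reduction(baseline_error, improved_error):
    """Fraction of the baseline error removed by the improved forecast."""
    return (baseline_error - improved_error) / baseline_error

# The article's reported figures: 50% error down to 20% error.
print(relative_reduction(0.50, 0.20))  # prints 0.6, i.e. a 60 percent reduction
```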
As mixed signs of economic recovery continue around the world, the one near certainty is that volatile consumer demand will be with us for the foreseeable future. Fortunately, manufacturers can now use downstream data to accurately sense changes in demand as they happen and act on them immediately. A scalable, fully automated demand-sensing solution has been proven to reduce forecast error by as much as 50 percent for multinational consumer products manufacturers in an environment of economic uncertainty, where both consumer choice and retailers' actions are less predictable. So if economic uncertainty stays with us longer than we hope, at least manufacturers can breathe a little easier by harnessing downstream data.