Many companies have spent millions of dollars over recent decades on commercial software applications intended to strengthen their supply chain management performance. Many of them, looking back on the results, are becoming alarmed. Whether the investment was in product lifecycle management (PLM), enterprise resource planning (ERP), manufacturing execution systems (MES), customer relationship management (CRM) or another software application, the unrealized benefits of these investments have serious consequences for the companies' financial performance. Increasingly, the companies involved are finding that the underlying causes of these shortfalls are factors within their control. Many have undertaken initiatives to recover that lost performance, and realize the benefits in purchased component cost, logistics costs, and cycle time reduction they were - quite literally - banking on.
The cost of goods sold (COGS) in most manufacturing companies is heavily weighted toward material, with 65 percent to 75 percent of COGS a common situation. The balance typically comprises 25 percent to 30 percent in overhead costs (administrative and other indirect labor, equipment, facilities and supplies). The smallest slice of the COGS "pie" is usually direct labor, representing less than 10 percent. It is only natural, then, that much of the justification for software applications such as ERP and MES is made up of reductions in the cost of material, inventory and the overhead (indirect labor and facilities) associated with managing it. When companies invest heavily in these systems to more effectively manage their material and inventory related assets, they are expecting to experience higher inventory turns, lower ongoing investment levels in raw materials and supplies, and greater efficiencies in the staff that manages those assets. They expect those improvements to result from better visibility of the assets themselves, better tools for balancing supply and demand, and greater levels of automation that require less human intervention in repetitive decision making around tedious tasks such as generating and managing purchase orders and manufacturing work orders.
One of the most common underlying sources of trouble in these situations is that management takes shortcuts during the implementation of the systems. Perhaps the single greatest error managers make is failing to adequately cleanse the data they are loading into these new systems. Often, in the press of completing the implementation of these systems on time and within budget, the fundamental work of cleansing data, filling in the missing elements of data, and managing the correct conversion of data into the new systems is set aside. Increasingly, it seems, executives in these companies are finding that they are compelled to go back again - after the implementation has failed to yield expected results - and unearth these problems to deal with them.
A second common issue underlying many of these situations is training. People who are expected to operate these systems after they have been implemented need to understand not only how to perform the specific function to which they are assigned, but also what the overall process is within which they are working, and the outcome that the process is expected to produce. Under the pressure of managing the budget and schedule associated with a major systems implementation, this broader training scope is frequently sacrificed. Increasingly, as senior management "peels back the onion" in these cases, they find that deficient areas include not only data cleansing and testing, but training as well.
The third fundamental problem is less commonly exposed, less understood, and more rarely remediated. It is simply stated: Senior management executives very often expect that software applications such as ERP and MES will achieve substantive improvements without significant changes in the underlying business processes that the software is designed to enable. In more than three decades of doing these implementations, the author has observed that nothing is more often shortchanged than strengthening and validating the business processes that our new software applications manage. The result is that even the very best software manages a wasteful, sub-par business process, and the expected efficiencies never materialize.
In terms of the company's earnings performance, those three fundamental problems typically appear with the greatest impact in the following ways:
1. On the Inventory line of the balance sheet, where reductions are expected that far exceed the reductions experienced. Similarly, but to a smaller degree, on the Accounts Payable line and the Accrued Liability line where materials procurement is involved.
2. On the Material line of the COGS element of the Income Statement, where reductions are expected to result from efficiencies in procurement, grouping of purchases, and improved visibility and balance of demand against supply. Reductions in COGS have a dramatic impact on earnings compared to changes in revenue in most cases. Poor realization of these expected benefits can be devastating to the business case for investments in major software applications.
3. On the Indirect Labor line of the COGS element of the Income Statement, where efficiencies in processing work orders, purchase orders, exception messages, and other administrative processes such as accounting and billing are reflected. This is the most sensitive area of business case savings projections, of course, because these savings are usually expressed in the form of projected headcount reductions.
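The earnings leverage behind the second point can be sketched with a few assumed figures. The revenue, COGS, and material-share numbers below are hypothetical, chosen only to illustrate the arithmetic:

```python
# Illustrative only: why a material-cost reduction can outweigh an
# equal-percentage revenue increase. All figures are assumed.

revenue = 100.0          # assumed annual revenue, $M
cogs = 80.0              # assumed COGS, $M
material_share = 0.70    # material as a share of COGS (65-75% is typical)

baseline_profit = revenue - cogs  # gross profit before either change

# Case A: grow revenue 5 percent (COGS assumed to scale with volume)
profit_a = revenue * 1.05 - cogs * 1.05

# Case B: cut material cost 5 percent at flat revenue
material_savings = cogs * material_share * 0.05
profit_b = baseline_profit + material_savings

print(baseline_profit, profit_a, profit_b)
```

Under these assumptions, the 5 percent material-cost reduction lifts gross profit from 20.0 to 22.8, while the 5 percent revenue increase lifts it only to 21.0, which is why shortfalls in expected material savings hit the business case so hard.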
Many companies that are experiencing disappointing results from the implementation of major software applications are initiating corrective actions. In most cases, they are finding that remediation requires the completion of unfinished tasks from the preparation and implementation activities included in the original project. The most common are:
1. Data cleanup activities and data strengthening activities. Data elements such as bills of material, manufacturing routings, labor and time standards, machine capacities and capabilities, and material lead times are all critical to the fidelity of enterprise resource planning and manufacturing execution systems. Similarly, accurate material quantities, material master records, and part master records as well as design version data and effectivity data are critical to the fidelity of product lifecycle management and product data management systems. Customer records and pricing records are critical to CRM systems, and the list goes on. The bottom line is that every data element whose quality is substandard reduces the software application's fidelity, and ultimately weakens the company's earnings because expected efficiencies are lost.
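A minimal sketch of the kind of automated data-quality audit that supports cleanup work like this follows. The record fields, field names, and checks are hypothetical, not drawn from any particular ERP or PLM system:

```python
# Hypothetical part-master audit: flag records whose critical fields
# are missing or implausible before they are loaded into a new system.

REQUIRED_FIELDS = ["part_number", "description", "lead_time_days",
                   "unit_of_measure"]

def audit_part_records(records):
    """Return (part_number, problem) pairs for substandard records."""
    findings = []
    for rec in records:
        for field in REQUIRED_FIELDS:
            if not rec.get(field):
                findings.append((rec.get("part_number", "<unknown>"),
                                 f"missing {field}"))
        lead_time = rec.get("lead_time_days")
        if isinstance(lead_time, (int, float)) and lead_time <= 0:
            findings.append((rec.get("part_number"),
                             "non-positive lead time"))
    return findings

# Example: one clean record, one with a missing lead time.
sample = [
    {"part_number": "A-100", "description": "Bracket",
     "lead_time_days": 14, "unit_of_measure": "EA"},
    {"part_number": "A-101", "description": "Housing",
     "lead_time_days": None, "unit_of_measure": "EA"},
]
print(audit_part_records(sample))
```

Running checks like these against every critical data element before conversion, rather than after go-live, is precisely the work that gets set aside under schedule pressure.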
2. Training. Especially in the areas where decisions are made each day around balancing supply against demand and executing purchase orders, training is hard to over-emphasize. The companies most rapidly correcting this situation have come to recognize that employees need to understand not only how to operate the software, but why they are doing what they are doing - how it fits into the end-to-end business process. Training sessions in these companies are most effectively deployed when they begin with a description of the business process, and then train employees to act in the context of that overall process.
3. Business Process Re-engineering. Companies that find they have simply "paved the cow path" by putting an attractive software wrapper around fundamentally flawed business processes have to rethink those processes, and remove the non-value-adding steps and associated times. Leading ERP systems all have business process templates incorporated within them, and most successful implementers have largely adapted to those generic business processes through a mechanism called "package-based re-engineering" as a part of their implementation. (Other software applications such as PLM and manufacturing execution systems are typically less process-prescriptive, which leaves companies a bit more reliant on systems integrators and other consultants to ensure that best industry practices are incorporated.) The least successful do things like perform the basic work in spreadsheets or personal databases outside the formal systems, importing only the results into the formal system. The rigidity, lack of responsiveness and sheer error volume introduced by that approach have cost companies millions.
There are other challenges, of course, with corresponding remediation techniques. But as we all learned early on in our professional careers, the Pareto Principle dictates that the "significant few" are the elements to be managed for the greatest success. When it comes to recovering from lower-than-expected results from major software investments, companies undertaking that work are discovering that the elements we have described here are the significant few that are making the difference.
Computer Sciences Corporation