Over the last nine years, many decisions have been made with scant attention to interest rates or inflation, because both have been minimal. We have almost forgotten how quickly small changes in borrowing costs can turn otherwise solid models upside down – and in procurement, the model that causes the most damage might be a supplier's, or a supplier's supplier's. The impact of interest rate changes will be felt in the cost to carry inventory, the health of suppliers and the effect of payment terms, among many other dimensions. Small costs will snowball because no one thinks to factor in the new rates.
Beyond the Time Bombs, There is Hidden Data
The lengthy run of low rates has led to another significant hurdle for procurement managers. The data needed to study the impact of rising rates is largely unreachable, dispersed and incompatible. Managers might be tracking suppliers with the biggest contracts, the worst delivery records, or the most sensitive goods, but they aren’t as close to those with the most debt, the highest inventory or the shortest payment terms. And if that data is actively tracked at all, it is hidden away in several different systems, held by line-of-business executives or finance departments in multiple divisions.
Spend and inventory data resides everywhere – from ERP systems to SharePoint and Excel spreadsheets to public databases on the Web – making it difficult to get a complete picture of total purchasing. In fact, it is so difficult that all too often companies end up using only a portion of their data in sourcing analysis, rather than everything they need for a complete picture.
Traditionally, finding, cleaning and preparing data for sourcing analytics has been a manual, slow process that is not repeatable once the current analysis is done. Every data unification project is an expensive one-off. It doesn’t take much of a cost/benefit analysis to show that in many cases, the work isn’t worth the savings.
When it comes to interest rates, however, the cost of missing a time bomb such as sudden pricing increases or part shortages as suppliers adjust their business practices to accommodate the new rates could be severe. It’s time to modernize data preparation.
Don’t Consolidate Data – Unify It
Until recently, companies tried to make all employees follow top-down strictures on data formatting. Despite countless strategies and herculean efforts, data variety persists. Brute force regulation of data doesn’t work, and brute force wrangling of data is too painful. Organizations need a way to organize data that doesn’t impose a system from the top down or reformat existing data sources.
Data unification is a recent development in data analysis that brings data from various sources together without imposing any external structure on the original sources, and it is particularly suited to investigating rate-hike sensitivities. Data unification aims to catalog all available and relevant data sets, then connect those data sets for use as if they were a single entity. All of the data regarding spend and inventory can be treated as a common resource, and any analysis becomes repeatable – even automatable – because new data sources can be folded into the mix.
That last part is particularly important as these rate hikes will come over time, and it will be important to check in regularly for developing issues. Further, the ability to see all relevant data as part of the analysis is important not only to catch outliers that are particularly interest rate sensitive, but to understand the global exposure to rate changes, and how it manifests itself so corrective or precautionary measures can take effect in time.
A Two-Step Process, With Lasting Impacts
The first step to understanding interest rate sensitivities is cataloging data sources. Given the variety and dispersed nature of these sources, it is important to keep a good inventory of available data. Most people in any organization only have immediate access to about ten percent of the company’s data. A reasonable review of interest rate exposure could include ERP data, to understand total exposure to each supplier; spreadsheets and contracts, containing supplier inventory requirements; and external data with supplier financials. There are free data catalog tools that can help analysts discover, organize and understand procurement data in the organization. It is important to get the tools distributed throughout the company so data owners can easily plug their sources into the catalog.
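As a rough illustration of what a catalog entry captures, here is a minimal sketch in Python. The source names, systems, owners and fields are all hypothetical, and real catalog tools track far richer metadata than this:

```python
from dataclasses import dataclass, field

# Illustrative catalog entry; the names and fields below are invented examples.
@dataclass
class CatalogEntry:
    name: str
    system: str           # e.g. "ERP", "SharePoint", "external"
    owner: str            # the line-of-business or finance contact who maintains the source
    fields: list = field(default_factory=list)

catalog = [
    CatalogEntry("supplier_master", "ERP", "finance", ["supplier_id", "contract_usd"]),
    CatalogEntry("inventory_terms", "SharePoint", "procurement", ["supplier_id", "min_inventory"]),
    CatalogEntry("supplier_financials", "external", "analytics", ["supplier_id", "debt_to_equity"]),
]

def sources_with_field(entries, field_name):
    """List every registered source exposing a given field -- the first step toward joining them."""
    return [e.name for e in entries if field_name in e.fields]

print(sources_with_field(catalog, "supplier_id"))
```

Even a toy catalog like this makes the later connection step concrete: a shared field such as a supplier identifier is what lets dispersed sources be treated as one.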
From there, analysts can evaluate fields in each source relevant to their research, to establish where data from each source relates to data in other sources. This data connection process is no small feat. It requires intelligence that is spread around the company, and a global analysis like this could require dozens or hundreds of sources.
To make the road easier, companies are relying on a combination of machine learning and expert sourcing. Machine learning tools can identify data sets that have strong correlation based on data similarities, past matches and other factors, helping analysts quickly combine multiple fields. Where machine learning and the analyst’s own judgment fall short, a trusted network of experts in the data is called upon to clear up issues. This, too, can be aided by machine learning, with software helping to identify the best expert for each question.
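The similarity-matching idea can be sketched in a few lines of Python using only the standard library. This is a deliberately simple stand-in for the machine learning described above: the supplier names are invented, and string similarity via `difflib.SequenceMatcher` is just one of many signals a real tool would combine (requires Python 3.9+ for `removesuffix`):

```python
from difflib import SequenceMatcher

# Hypothetical supplier name lists from two source systems (illustrative data only).
erp_suppliers = ["Acme Industrial Supply", "Globex Corporation", "Initech LLC"]
ap_suppliers = ["ACME Industrial Supply Inc.", "Initech, LLC", "Umbrella Logistics"]

def normalize(name: str) -> str:
    """Lowercase and strip punctuation and legal suffixes so cosmetic differences don't block a match."""
    cleaned = "".join(ch for ch in name.lower() if ch.isalnum() or ch.isspace())
    for suffix in (" inc", " llc", " corporation", " corp"):
        cleaned = cleaned.removesuffix(suffix)
    return cleaned.strip()

def propose_matches(left, right, threshold=0.85):
    """Pair each left-hand record with its best fuzzy match on the right, above a cutoff."""
    matches = []
    for l in left:
        best = max(right, key=lambda r: SequenceMatcher(None, normalize(l), normalize(r)).ratio())
        score = SequenceMatcher(None, normalize(l), normalize(best)).ratio()
        if score >= threshold:
            matches.append((l, best, round(score, 2)))
    return matches

for l, r, score in propose_matches(erp_suppliers, ap_suppliers):
    print(f"{l!r} <-> {r!r} (similarity {score})")
```

Notice that the weakest candidates fall below the threshold and are left for a human – which is exactly where the expert network comes in.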
The Long-lasting Benefits of Data Unification
Once the data is clean and ready for analysis, a handful of tests can uncover risks and opportunities. For example, companies might rank suppliers according to the size of their contracts and debt levels, putting plans in place should those vendors falter. They might investigate which suppliers carry high inventory or supply low-volume and custom parts, working with those vendors to increase advance ordering or implement just-in-time strategies. They can find outliers with unusually aggressive discounts for early payments and negotiate even stronger incentives.
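The first of those tests – ranking suppliers by contract size and debt – could look something like the following sketch. The records, fields and weighting are hypothetical; the point is only that once data is unified, a risk screen is a few lines of code:

```python
# Illustrative risk screen: surface highly leveraged vendors with large contracts first.
# Supplier records and the scoring formula are invented for illustration.
suppliers = [
    {"name": "Acme", "contract_usd": 4_200_000, "debt_to_equity": 2.8},
    {"name": "Globex", "contract_usd": 9_500_000, "debt_to_equity": 0.6},
    {"name": "Initech", "contract_usd": 1_100_000, "debt_to_equity": 3.5},
]

def rate_risk_rank(records):
    """Order suppliers so that big contracts combined with heavy leverage rise to the top."""
    max_contract = max(r["contract_usd"] for r in records)

    def score(r):
        # Normalize contract size to 0..1, then weight by leverage.
        return (r["contract_usd"] / max_contract) * r["debt_to_equity"]

    return sorted(records, key=score, reverse=True)

for r in rate_risk_rank(suppliers):
    print(r["name"])
```

A moderately sized contract with a heavily indebted vendor can outrank the largest contract on the books – exactly the kind of outlier a pure spend ranking would miss.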
From there, it’s a small matter to bring new or changing data into the existing views, especially if machine learning and expert resources are used. This reusability becomes a permanent part of spend analysis, which has several implications. For example, testing suppliers for fiscal health under increased rates might reveal that the vendor base is not as diversified as expected because a holding company owns many suppliers. Or it might reveal that one division has far better payment terms from a supplier than many others. A unified view of sourcing data can be analyzed several ways that bring far more of a company’s spend under control.
A global diversified manufacturing company underwent a review of suppliers last year that aggregated a list of more than five hundred thousand suppliers across hundreds of ERP systems, discovered 25 to 35 percent supplier overlap across several major business units, and identified more than $100m in savings through payment terms analytics.
The same process that finds interest rate time bombs can uncover inefficiencies in the long tail of suppliers that not only cover the cost of the work, but can save an additional 1 to 2 percent of total spend. That could be enough to offset the effects of the interest rate increases entirely.
Data unification is the best way to find interest rate hike exposure quickly enough to defuse issues before they explode, and it provides the same level of visibility into the long tail of spend as companies currently have into their top suppliers. The long-term impact of bringing the vast majority of spend under control will pay dividends well beyond the current slate of rate hikes.
Source: Tamr Inc.