Volume 16 | Issue 3
How best to turn mountains of data into actionable information? It’s an important question, and one that demands an immediate answer.
While some companies curse the landslide of data overwhelming their organizations, others look to capitalize on what they realize is an opportunity to better understand their customers, suppliers, and costs.
But the challenge is more nuanced than sculpting a huge collection of unwieldy data. Many organizations find that instead of a “Big Data” problem, they have hundreds of “Little Data” problems, or “molehills” of unconnected data around the world.
An Example
One manufacturer – a Fortune 500 global company – was experiencing this problem in spades. It had more than 100 ERP instances that didn’t communicate with each other. Therein lay the challenge: shoveling all those molehills together into one cogent structure of data.
But before this organization could even think about employing analytics to understand its global spend on goods and services, it needed to consolidate that spend into one comprehensive system. Most of the time, project teams will recommend migrating to a single ERP – standard practice for companies that grow by acquisition.
“The ERP vendors do a very good job of selling their software as the magic bullet, of being able to get all the data pulled together and gain visibility across the organization,” says Wally Powers of West Monroe Partners. “While in many cases that may be true, it’s still a very painful and time-consuming process. There’s no guarantee that it’s going to provide exactly the information you need.”
For many reasons (e.g., time, resources, cost), moving to a common ERP system was out of the question for this organization. Ultimately, this left the parent company without any visibility into its collective spend – a data-rich, information-poor world. In other words, the company would have to roll all those little molehills into one big data mountain.
“We also see this scenario a lot with private equity groups, where you have a PE firm with a number of portfolio companies, all on different platforms,” says Powers. “Sometimes extracting the needed data can be very difficult and time consuming for all those portfolio companies. Typically, the PE firms cannot independently access portfolio company data and must request individual data extractions from each portfolio company.”
Daunting Task
Given this predicament, the US-based parent company, with subsidiaries manufacturing goods around the world, faced a daunting task before it could launch a strategic sourcing initiative.
Getting any spend data out of its myriad systems would be a time-consuming, challenging process. Ensuring the comparability of data across 100-plus systems and multiple organizations posed a further obstacle to assembling clean data for the global sourcing initiative. With spend on goods and services occurring at well over 100 locations globally and estimated at more than $3 billion annually, any solution for capturing the data needed to be easy to implement and manage.
As part of the strategic sourcing initiative, the parent company wanted to leverage the historical spend data residing in its disparate systems to better understand total company-wide spend and to execute eSourcing events. It also wanted the ability to manage its supplier contracts effectively.
Two major tasks were required to set up this initiative: tool selection and implementation, and acquisition of historical data.
Tool Selection and Implementation
The client’s strategic sourcing priorities were spend analysis, eSourcing, and contract management.
To ensure selection of the best tool(s) for the client, the project team conducted a thorough sourcing effort for the best-fit package that provided both a spend management front-end and a Data Warehousing (DW) back-end infrastructure.
During the selection process, the team determined that the selected tool contained the required features and functionality in all three areas, and that data normalization, validation, and loading could be further customized with existing technology.
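The article does not detail how that normalization was implemented, but in spend analysis it typically means canonicalizing supplier names before data is loaded, so the same vendor isn’t counted several ways. A minimal Python sketch, with a hypothetical alias table standing in for the real rules library:

```python
import re

# Hypothetical alias table; the production rules library is not
# described in the article.
SUPPLIER_ALIASES = {"ACME CORP": "ACME CORPORATION"}

# Common legal suffixes to strip before matching.
LEGAL_SUFFIX = re.compile(r"\b(INC|LLC|LTD|GMBH|SA|CO)\.?$")

def normalize_supplier(raw_name: str) -> str:
    """Uppercase, collapse whitespace, strip punctuation and legal
    suffixes, then map known aliases onto one canonical name."""
    name = re.sub(r"[.,]", " ", raw_name).upper()
    name = re.sub(r"\s+", " ", name).strip()
    name = LEGAL_SUFFIX.sub("", name).strip()
    return SUPPLIER_ALIASES.get(name, name)

assert normalize_supplier("  Acme Corp. ") == "ACME CORPORATION"
assert normalize_supplier("acme corporation, inc") == "ACME CORPORATION"
```

The point of a design like this is that the rules live in data rather than code, so they can be extended system by system without touching the load pipeline.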
“This allows you to pull that information together relatively quickly,” indicates Powers. “I say relatively quickly because an ERP implementation can range from 12 to nearly 36 months, based on complexity. I know companies that have been doing ERP implementations for a couple of years and they’re still not done. When done right, the Spend Analysis tool allows organizations to have quick and easy access to that ‘right’ data, through regular monthly data feeds.”
Getting Portfolio Company Buy-In
Due to the decentralized structure of the business, the client’s management team chose not to “mandate” the project. Instead, management asked the project team leaders to meet with the leadership team of each company to discuss the project benefits and what was needed from them to execute.
Another important step to earn executive buy-in was to give a live demo of the tool to the Corporate Sourcing Group and procurement teams at each portfolio company. The biggest benefit of this effort was showing company leaders the type of information they and their teams would have access to on a regular basis.
When discussing the reporting capabilities of the tool, one of the group presidents said he thought the company could already create these types of reports with its existing systems. Sure, the company’s financial team could – but it was a very manual, time-consuming process. If he asked for consolidated spend information to be delivered the following week, he would unknowingly disrupt the weekend for his procurement and finance teams.
Professionals in a procurement group are not hired and retained for their skill at building pivot tables and merging Excel worksheets into a combined schedule. To do their jobs effectively, they need timely, accurate spend information that is easily accessible. “Without that data, they are trying to fight the strategic sourcing battle with one hand tied behind their backs,” says Powers.
Once data is loaded into a spend analysis tool, trained users can easily create reports presenting customized views that allow them to execute their strategic sourcing function.
“What a senior leader once referred to as ‘aggressive data manipulation’ would now be easily accessible without any Excel voodoo – a task any procurement professional could complete in a matter of minutes, not days,” reveals Powers.
Acquisition of Historical Data
Selection of the best possible tool was important, but acquisition of actual spend data from so many systems proved the real challenge.
The client’s leadership team decided that the tool should be populated with three years of historical spend data. This meant that in addition to extracting data from more than 100 active ERP systems, spend data would also need to be pulled from about 40 legacy (dead) systems.
The tool needed to be populated with data from the purchase order, accounts payable, and goods receipt modules of the ERP system(s) at each company. The goal was to capture information from these systems without making changes to the “local” ERP systems. By creating a standardized data format in a proprietary data warehouse, IT staff at each company could extract the required data fields consistently to populate the tool.
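The article does not publish the file specification, but a standardized format of this kind boils down to one fixed record shape that every local IT team populates from its own ERP. A hypothetical sketch (all field names here are assumptions, not the proprietary layout):

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class SpendRecord:
    """One row of the standardized extract. Field names are
    hypothetical; the real specification is proprietary."""
    source_system: str  # which of the 100-plus ERP/legacy systems
    module: str         # "PO", "AP", or "GR" (goods receipt)
    company_code: str
    supplier_name: str
    invoice_date: date
    currency: str       # ISO 4217 code, e.g. "USD"
    amount: float       # spend in transaction currency

# Each local IT team maps its ERP's native fields onto this one
# shape, so downstream validation and loading never change.
row = SpendRecord("SAP-DE-01", "AP", "1000",
                  "ACME CORPORATION", date(2013, 4, 1), "EUR", 1250.00)
```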
Hundreds of tests and validations on millions of records were performed on the data submitted from each ERP system. The proprietary software automated the data file validation process, and generated error reports with details on required resolution.
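The proprietary validation software itself is not described, but conceptually the check layer runs completeness and sanity rules over each submitted file and emits an error report. Continuing the SpendRecord sketch above, with illustrative checks only:

```python
def validate_rows(rows):
    """Run basic completeness and sanity checks on extracted rows,
    returning an error report keyed by row index. The checks shown
    here are illustrative, not the actual test suite."""
    errors = {}
    for i, r in enumerate(rows):
        problems = []
        if not r.supplier_name.strip():
            problems.append("missing supplier name")
        if r.module not in {"PO", "AP", "GR"}:
            problems.append(f"unknown module {r.module!r}")
        if len(r.currency) != 3:
            problems.append(f"invalid currency code {r.currency!r}")
        if r.amount == 0:
            problems.append("zero amount")
        if problems:
            errors[i] = problems
    return errors  # an empty dict means the file passes

print(validate_rows([row]) or "file passed validation")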
Approving the Data
Even though the project team conducted data tests, each company owned its own data and was therefore required to sign off on the completeness and reasonableness of its submission. The approval process provided summary-level data on Total Spend and Spend by Supplier to company leaders for review and approval before their data was added to the tool.
“Enabling a data approval process was extremely important because the data was delivered to us from disparate sources,” says Powers. “We wanted to give company leadership the opportunity to review the data and ensure that the data was as clean and complete as possible.”
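As an illustration of the approval view itself – again reusing the hypothetical SpendRecord shape from above – Total Spend and Spend by Supplier are straightforward aggregations over the validated records:

```python
from collections import defaultdict

def approval_summary(rows):
    """Summarize Total Spend and Spend by Supplier for leadership
    sign-off before the data is loaded into the tool."""
    by_supplier = defaultdict(float)
    for r in rows:
        by_supplier[r.supplier_name] += r.amount
    total = sum(by_supplier.values())
    return total, sorted(by_supplier.items(),
                         key=lambda kv: kv[1], reverse=True)

total, by_supplier = approval_summary([row])
print(f"Total spend: {total:,.2f}")
for name, amount in by_supplier:
    print(f"  {name}: {amount:,.2f}")
```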
Categorization of the Spend
For the submitted data to be analyzed accurately, it needed to be properly categorized. Many tools provide a classification engine that leverages a library of rules to classify data appropriately and allow for meaningful analysis. To verify the output of this automated process and enhance the classification, a detailed review must be done by those who really know the data.
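The article does not name the classification engine, but a rules library of this kind commonly maps keyword patterns in line-item descriptions to spend categories, with anything unmatched routed to the detailed human review described above. A minimal sketch with invented rules and categories:

```python
import re

# Hypothetical rules: first matching pattern wins. A real library
# would hold thousands of such rules, maintained over time.
CATEGORY_RULES = [
    (re.compile(r"steel|aluminum|coil", re.I), "Direct / Raw Materials"),
    (re.compile(r"freight|carrier|logistics", re.I), "Indirect / Transportation"),
    (re.compile(r"temp|staffing|contractor", re.I), "Indirect / Labor"),
]

def classify(description: str) -> str:
    """Return the first matching category, or flag for manual review."""
    for pattern, category in CATEGORY_RULES:
        if pattern.search(description):
            return category
    return "UNCLASSIFIED - manual review"

print(classify("Hot-rolled steel coil"))  # Direct / Raw Materials
print(classify("Ocean freight, Q3"))      # Indirect / Transportation
print(classify("Misc. services"))         # UNCLASSIFIED - manual review
```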
Training the Users
Information is only valuable if it can be accessed by those who use it to make intelligent, sound decisions. That’s why training was the most important step. More than 450 users were trained through a customized three-day course delivered in eight countries across North America, Europe, and Asia.
“Training was critical, as this system was revolutionary in enabling organization-wide visibility to support more collaborative and strategic activities,” says Powers. “Users needed to know what information was available and how to access it.”
The risk: without a clear understanding of the system, employees could run searches improperly and receive inaccurate or incomplete information, adds Powers. “Our training helped ensure that the manufacturing entities could not only access the tremendous amount of new data, but also generate accurate reports.”
Now when the parent company and their many global entities need to access details of their direct and indirect spend with nearly 100,000 suppliers of goods and services, it’s simple. “All they need do is log into their spend analysis tool and search for the information they need,” says Powers. “Once a user is in the tool, the average time to complete a targeted search for critical information on their global spend is about two minutes.”
The Result
This company has taken its 100-plus molehills of spend data, turned them into a big data mountain, and then used a spend analysis tool to turn that mountain into a field of information ripe with opportunities for strategic sourcing – and savings.
A properly executed strategic sourcing program can reduce total purchasing spend by well over 15 percent – on this company’s more than $3 billion in annual spend, that works out to upwards of $450 million. “That’s money no company can afford to leave on the table in these tough economic times,” Powers points out. “Organizations across multiple industries regularly find themselves in a similar position. There’s a lot of talk about big data, and that continues to be how our clients typically define this problem. However, the true value is in the ability to gain intelligence from large amounts of data.”
Wally Powers is a director in West Monroe Partners’ operations excellence practice.