Sustainable Pricing Data Analytics, Industry Today
Data and analytics: applying the 80/20 rule is key to successful pricing in manufacturing.

September 11, 2019

By Nicolas Magnette, Partner and Senior Solution Leader for B2B Pricing Solutions at Periscope® By McKinsey

The path to value creation with data in B2B pricing is riddled with pitfalls. Many companies, including manufacturing organizations, jump to technical solutions without defining clear use cases first. Later, they find out that the solution they chose does not meet their needs. Others are convinced that they don’t have enough data, or the right data, and end up doing nothing. Yet many companies do manage to create significant value with data-driven pricing, typically on the order of two to seven percent margin improvement.

We found that the winners succeed by applying the 80/20 rule in multiple respects. They size the prize up-front and focus on the use cases with the potential to make a real difference. They make a realistic assessment of data quality and implementation feasibility at the outset. Lastly, they’re prepared to augment and enrich existing data sources in creative ways. Building on this pragmatic spirit, leaders continuously refine and improve their data regime to create sustainable impact over time.

Following are the dos and don’ts for successful data-driven pricing efforts:

Size the Prize

Before you start in-depth data gathering or buy software for data-driven pricing, ensure you know where the value lies for your company. Involve business owners from day one to ensure buy-in for the effort. Develop hypotheses on the most promising business improvement opportunities that could be addressed with data and analytics.

Proven tools, such as the pocket margin waterfall, can help pinpoint leakage in specific business areas as well as the main drivers of value. In many cases, you’ll have to revisit your initial assumptions about the biggest opportunity. For example, one manufacturer went into a data-driven pricing effort assuming that shipping and handling charges would be the biggest value lever. However, populating the pocket margin waterfall with high-level data quickly revealed that shipping cost was insubstantial as a value lever relative to list prices and discounts. Thanks to the waterfall, the initial diagnostic took only a few weeks and helped the company focus its pricing efforts on the best use cases and biggest opportunities.
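To make the idea concrete, the pocket margin waterfall starts from list price and subtracts each deduction in sequence until the pocket margin remains. The sketch below illustrates the mechanics in Python; all figures and deduction categories are invented for illustration, not taken from the example above.

```python
# Minimal pocket margin waterfall sketch.
# All amounts and deduction names are hypothetical.
list_price = 100.0

# Deductions applied in sequence, from list price down to pocket margin.
deductions = [
    ("On-invoice discount", 12.0),
    ("Volume rebate", 5.0),
    ("Payment terms", 2.0),
    ("Freight & handling", 3.0),
    ("Cost of goods sold", 55.0),
]

running = list_price
print(f"{'List price':22s} {running:8.2f}")
for name, amount in deductions:
    running -= amount
    print(f"- {name:20s} {running:8.2f}")

pocket_margin = running
pocket_margin_pct = pocket_margin / list_price * 100
print(f"{'Pocket margin':22s} {pocket_margin:8.2f} ({pocket_margin_pct:.0f}% of list)")
```

Even a rough, high-level population of such a waterfall makes it immediately visible which deduction step leaks the most margin.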

Lay the Groundwork

An initial assessment of data availability and implementation feasibility is crucial to get a sense of the challenge that the prioritized use case presents. Start by clarifying what data is required for the selected use case(s). Specify the necessary level of detail, the most important splits, and the frequency of updates that may be required in the future to keep the model running. Focus on the data you need to answer the most important business questions. In the absence of external data, manufacturers can use their internal data. For example, changes in the discounts granted by sales representatives can indicate changes in competitors’ prices.
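The discount-as-proxy idea can be sketched very simply: track the average granted discount per period and flag periods where it jumps, which may indicate competitive price pressure. The figures, months, and threshold below are hypothetical, for illustration only.

```python
# Hypothetical: month-over-month change in average granted discount
# as a rough proxy for competitor price moves. All data is invented.
monthly_discounts = {
    "2019-01": [0.10, 0.12, 0.11],
    "2019-02": [0.11, 0.13, 0.12],
    "2019-03": [0.15, 0.16, 0.14],
}

# Average discount per month.
avg = {month: sum(d) / len(d) for month, d in monthly_discounts.items()}

months = sorted(avg)
for prev, cur in zip(months, months[1:]):
    delta = avg[cur] - avg[prev]
    # Flag jumps above an (arbitrary) 2-point threshold.
    signal = "possible competitor price move" if delta > 0.02 else "stable"
    print(f"{cur}: avg discount {avg[cur]:.1%} ({delta:+.1%}) -> {signal}")
```

Such a proxy is noisy, of course, but it can be good enough under the 80/20 rule to answer the business question at hand.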

Augment and Enrich Existing Data Sources

Once you have created an initial data inventory for your prioritized use cases, combine quantitative tools, like statistical analysis of data completeness and usability, with qualitative sources, like interviews, to get a sense of the quality. Talk to experts in IT, finance, sales and the legal department to identify potential issues in areas such as metadata quality, format compatibility and applicable privacy regulation. In some cases, you may not be allowed to use the data you have in the way you intend to or to combine data from different sources.
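A statistical check of completeness and usability does not need heavy tooling. As a sketch, the snippet below profiles missing values and distinct counts per field of a transaction extract; the field names and records are hypothetical.

```python
# Sketch of a data-completeness profile for a transaction extract.
# Field names and records are hypothetical.
transactions = [
    {"customer": "A", "product": "P1", "list_price": 100.0, "discount": 0.12},
    {"customer": "B", "product": "P1", "list_price": 100.0, "discount": None},
    {"customer": "C", "product": None, "list_price": 95.0,  "discount": 0.08},
]

fields = ["customer", "product", "list_price", "discount"]
profile = {}
for field in fields:
    values = [row[field] for row in transactions]
    present = [v for v in values if v is not None]
    profile[field] = {
        "completeness": len(present) / len(values),  # share of non-missing values
        "distinct": len(set(present)),               # number of distinct values
    }
    print(f"{field:12s} completeness={profile[field]['completeness']:.0%} "
          f"distinct={profile[field]['distinct']}")
```

A profile like this quickly shows which fields are reliable enough to build on and which ones need enrichment before the use case can proceed.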

To conclude preparations, conduct a feasibility assessment of the prioritized use cases, looking at three dimensions:

  • Risk
  • Required resources and investments
  • Expected time to impact
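A lightweight way to run this assessment is to score each use case on the three dimensions and rank by total. The sketch below assumes a simple 1-to-5 scale where higher means more favorable (lower risk, fewer resources, faster impact); the use-case names and scores are invented for illustration.

```python
# Hypothetical feasibility scoring: 1 = unfavorable, 5 = favorable
# on each of the three dimensions. Names and scores are invented.
use_cases = {
    "Discount harmonization": {"risk": 4, "resources": 3, "time_to_impact": 5},
    "Dynamic list pricing":   {"risk": 2, "resources": 2, "time_to_impact": 2},
    "Freight surcharge":      {"risk": 5, "resources": 4, "time_to_impact": 4},
}

# Rank use cases by total score, highest first.
ranked = sorted(use_cases.items(),
                key=lambda item: sum(item[1].values()), reverse=True)
for name, scores in ranked:
    print(f"{name:24s} total={sum(scores.values()):2d}  {scores}")
```

Weighting the dimensions differently (for example, doubling the weight on risk) is a natural refinement once the first pass has narrowed the field.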

Roll up Your Sleeves

Now it’s time to dive in. Make sure all relevant stakeholders and contributors are on the same page about the deliverables, overall timeline and immediate priorities. Formulate a clear strategy for data cleaning, integration and validation. As you build the model, test and refine it in iterations. Create a strong link between data collection and decision making.

The key to sustainable success? The comprehensive and consistent application of the 80/20 rule. Don’t strive for analytical perfection. Don’t aspire to complete coverage. Most importantly, don’t treat data as an end in itself. Data is a means to an end, and that end is value creation.

You don’t need perfect data or a perfect model. You need a model that provides reliable answers to your most pressing business questions. Consequently, the way you work with data and analytics should follow your business priorities. Once you have gotten started, you can and should keep working towards a more refined, holistic, and powerful data regime with improved data quality, continuous capability building and increasingly systematic application of data-driven decision making. As long as you keep the 80/20 rule in mind, and let it guide your decisions, you’ll reap disproportionate benefits all along the value chain.

Nicolas Magnette

Nicolas Magnette leads the B2B Pricing Solutions portfolio for Periscope® By McKinsey. In his role he is responsible for the global solution capability development and successful deployments to drive client impact. Nicolas joined Periscope® in 2011 and has since then supported more than 50 commercial transformation programs for B2B clients, mainly in chemicals, medical devices, and manufacturing. His focus has been on advanced analytics in pricing, front-line capability building and the deployment of commercial and pricing performance management systems. Nicolas has a passion for designing efficient and user-friendly products and has pioneered many of the innovations in Periscope®.

Before joining Periscope® By McKinsey, Nicolas was a generalist consultant at McKinsey, in Luxembourg and in South Africa. 
