Manufacturing organizations are facing a massive influx of data and, as a result, are increasing investment in ways to manage that volume at velocity.


It's no secret that businesses across all industry sectors are seeing a significant increase in the volume, velocity, and variety of data feeding into their organizations. The challenge of managing these ever-increasing data volumes has only been exacerbated over the last year by the additional strain of the global pandemic and the sudden shift to new ways of working.

The manufacturing industry in particular has been significantly impacted. According to recent research, over 50 percent of firms surveyed in the manufacturing sector reported a significant increase in multiple data types, including eCommerce data, customer data, and data from sensors. In the same survey, manufacturing firms reported significant increases in investment in the following technologies:

  • Cloud computing (50% of firms significantly increased investment across 2020)
  • Data analytics (45% of firms significantly increased investment across 2020)
  • Automation (43% of firms significantly increased investment across 2020)
  • Machine learning and AI (42% of firms significantly increased investment across 2020)

What this tells us is that manufacturers understand that faster access to better quality data is critical to improving operations and lowering costs. And while this is a good starting point, manufacturers still have work to do when it comes to implementing a process of "continuous intelligence" – where real-time and historical data are continuously combined for real-time analysis – to yield rapid, accurate, machine-driven decision making.
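As a concrete illustration of the "continuous intelligence" pattern described above – scoring each live reading against a rolling historical baseline rather than analyzing batches after the fact – here is a minimal Python sketch. The class name, window size, and readings are hypothetical, not any particular vendor's API:

```python
from collections import deque

class ContinuousBaseline:
    """Maintain a rolling historical baseline and score each new
    reading against it as it arrives (illustrative sketch)."""

    def __init__(self, window=100):
        self.history = deque(maxlen=window)  # bounded historical context

    def update(self, value):
        """Add a reading and return its deviation from the rolling mean."""
        self.history.append(value)
        mean = sum(self.history) / len(self.history)
        return value - mean

# Historical context and live analysis happen in the same pass:
baseline = ContinuousBaseline(window=5)
for reading in [10.0, 10.2, 9.9, 10.1, 14.5]:
    deviation = baseline.update(reading)  # large deviation flags the last reading
```

The point of the design is that the analysis never waits for a batch job: every reading is evaluated in the moment, against context that is continuously updated.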

A key hurdle is in getting technology and processes aligned to enable the operational shift to continuous intelligence. Let’s take a closer look at what manufacturing organizations can do to make this shift a reality, and start realizing the full benefits that real-time analytics can offer. 

Be Aware of the Pitfalls of Legacy Technologies

Ingesting data fast enough to support analysis and decision making is a critical requirement for continuous intelligence. However, capturing, managing, and analyzing data at speed can push legacy technologies to their limit. For example, where in the past periodic samples were taken to detect quality issues, manufacturers are now looking to monitor processes and tools 24×7 – simply being able to capture and store measurement data is a sizeable task in and of itself. Add to this the analysis piece, and existing systems will quickly struggle.
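One common way to keep 24×7 capture from being overwhelmed is to decouple the measurement path from the slower analysis path with a bounded buffer, so a burst of readings degrades gracefully instead of stalling capture. The sketch below is a generic illustration of that decoupling, not a description of any specific system:

```python
import queue

def capture(sample, buf):
    """Enqueue a measurement without ever blocking the capture path.
    Returns False when the buffer is full and the sample is shed."""
    try:
        buf.put_nowait(sample)   # non-blocking: the measurement loop never waits
        return True
    except queue.Full:
        return False             # shed load rather than fall behind

# Analysis consumes from the same buffer at its own pace:
buf = queue.Queue(maxsize=10_000)
capture({"sensor": "temp-01", "value": 72.4}, buf)
```

In production the consumer would run in its own thread or process; the essential property is that capture throughput is never gated by analysis throughput.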

Ensuring data is of sufficient quality for analysis and decision making is critical and can have health, safety, and production implications. Sensors and machines can malfunction, and data can arrive late or out of order. You need methods in place to continuously check for and address anomalies in data – and to do so quickly enough that you can take preventative and/or corrective action.
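The checks described above can run in-line, before a reading ever enters analysis. A minimal sketch of such a validation step is shown below; the field names, thresholds, and valid range are hypothetical values for illustration:

```python
def validate_reading(reading, last_timestamp, now,
                     max_age_s=5.0, valid_range=(0.0, 200.0)):
    """Return a list of problems found with one sensor reading.
    Thresholds and field names here are illustrative only."""
    problems = []
    if reading["timestamp"] <= last_timestamp:
        problems.append("out-of-order")   # arrived behind an earlier reading
    if now - reading["timestamp"] > max_age_s:
        problems.append("late")           # too stale to act on safely
    lo, hi = valid_range
    if not lo <= reading["value"] <= hi:
        problems.append("out-of-range")   # possible sensor malfunction
    return problems

# A reading that is behind the last one, stale, and outside the valid range:
issues = validate_reading({"timestamp": 100.0, "value": 250.0},
                          last_timestamp=101.0, now=110.0)
```

A non-empty result is the trigger for the preventative or corrective action the article describes, rather than letting a bad reading silently skew downstream analytics.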

Another challenge with legacy technologies is access to data. One of the biggest complaints from engineers, data scientists, and technicians is the difficulty of getting access to raw, unfiltered data from tools. Legacy tools place restrictions on data access so as not to impact operation of the system, and where access is provided, it often comes many hours – or even days – after the fact. Again, for an operating model of continuous intelligence to be successful, access to data must be near instantaneous and constant.

Analyze Data Insights at the Edge

Once the data is collected, the challenge is deriving meaningful value to detect anomalies, make predictions, and make recommendations for improving real-time operations. Even if an organization has the data, if the analytics process takes many hours to days to execute, the value of that data will quickly degrade, causing the organization to miss a window of opportunity that could allow for competitive differentiation.

As manufacturing processing accelerates, so does the need to make decisions in milliseconds. Making assessments and taking action – such as determining whether a part has to be rejected – has to be done at the speed of the process. To achieve those speeds, data processing and analytics must occur close to where the action is, often at the edge. Moving data to the cloud and back adds too much latency to the process and presents risks to production facilities if access to the Internet were disrupted.
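The accept/reject example above can be made concrete: the decision logic runs locally at the edge, so it completes at process speed with no cloud round trip. The nominal dimension and tolerance below are illustrative values, not figures from the article:

```python
def inspect_part(measurement_mm, nominal_mm=25.0, tolerance_mm=0.05):
    """Accept or reject a part locally, at process speed, with no
    cloud round trip. Nominal size and tolerance are illustrative."""
    deviation = abs(measurement_mm - nominal_mm)
    return "reject" if deviation > tolerance_mm else "accept"

# Decisions happen in-line with the process:
inspect_part(25.02)   # within tolerance
inspect_part(25.10)   # outside tolerance
```

Because the function depends only on local inputs, it keeps working if the Internet connection to the plant is disrupted – the resilience argument the paragraph above makes for edge processing.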

Getting Started with Data Analytics

Given the challenges outlined above, making the right choice on a real-time analytics partner is critical.

Firstly, consider how a real-time analytics solution might slot into the existing data environment. It’s unlikely that many firms will rip and replace their data software stack, so solutions must be able to be swiftly and easily incorporated into existing systems. Ideally, they should be compatible with the major cloud platforms and computing architectures, and interoperable with popular programming languages such as Python to make use of analytical and machine learning service providers.

Additionally, software platforms should be deployable at the edge – close to where data is generated, near the tool or in the field – as well as on-premises and in cloud infrastructure, to support workloads and use cases where latency matters, while simultaneously providing real-time consolidated views across assets, processes, factories, and locations.

Security will also be important – it's recommended that manufacturing organizations work with reputable service providers that can install and configure technology in a way that will not put their organization at risk while enabling them to scale.

Ongoing maintenance and operational costs are other factors to account for, along with the level of professional services that are available to support the analysis, remediation, and migration of data. Businesses may also want to look at the experience that exists within the organization, to see if the appropriate skill sets exist or whether training and hiring policies need to be updated.

When done effectively, harnessing data to understand and optimize machine use and maintenance can set manufacturing organizations apart from their competition. Whether navigating repair schedules for a fleet of machines to ensure they are optimized for large production runs, or understanding even the minor operational corrections needed on a single machine, real-time data comprehension and analysis is key to maximizing operations. Unlocking continuous intelligence requires moving beyond the age of siloed data management with batch analysis of historical data, and into an era where organizations bring together real-time and historical data from across the business for analysis 'in the moment' to drive faster, smarter decision making.

Przemek Tomczak

About the Author
Przemek Tomczak is Senior Vice-President of IoT and Utilities at KX and First Derivatives plc, where he leads the internet of things and utilities industry verticals globally. Przemek has over 24 years of IT and business leadership experience implementing and operating big data and analytics systems, delivering program and transformation initiatives, and providing consulting, outsourcing, and risk management in the energy and utilities sector and other industries.
