One of the biggest challenges in effectively implementing (and getting a return on your investment from) a Six Sigma program is ensuring that you have the right data, and enough of it, to pinpoint your options for eliminating variance, and therefore defects, within your process.
The application of big data and predictive analytics fundamentally shifts the way we identify defects and explore root causes. The integration of the Internet of Things (IoT) enables companies to collect not only common manufacturing metrics (cycle time, changeovers, utilization) but also the contextual data describing the conditions under which those metrics were produced (ambient temperature, operator, position). When all of this data, typically spread across several disparate systems, is brought together, a truly dynamic and thorough model can be created.
This compilation of data can then be used to investigate and model Six Sigma improvements to the process and to the conditions within which the process is applied. This more holistic view of defect reduction can help uncover the simplest of changes that have a profound and lasting impact on manufacturing performance.
Imagine being able to identify that, by raising the ambient temperature in your inventory warehouse a few degrees and halving your processing lot sizes, you could decrease tool wear by 10 percent. Now imagine being able to test this scenario from your office before rolling it out on the manufacturing floor.
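This kind of desk-side scenario testing can be sketched with a simple predictive model. The data, column names, and coefficients below are all hypothetical; a real program would draw on the integrated historical data described later, and likely a richer model than a linear fit.

```python
# Hypothetical what-if sketch: fit a linear model of tool wear against
# ambient temperature and lot size from (synthetic) historical records,
# then score a proposed scenario before trying it on the floor.
import numpy as np

# Synthetic history: [ambient_temp_C, lot_size, tool_wear_pct]
history = np.array([
    [18.0, 400, 12.1],
    [19.5, 380, 11.4],
    [20.0, 350, 10.9],
    [21.0, 300, 10.1],
    [22.0, 220,  9.2],
    [23.0, 200,  8.8],
])

# Least-squares fit: wear ~ a*temp + b*lot_size + c
X = np.column_stack([history[:, 0], history[:, 1], np.ones(len(history))])
y = history[:, 2]
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

def predict_wear(temp_c, lot_size):
    """Predict tool wear (%) for a proposed operating scenario."""
    return coef[0] * temp_c + coef[1] * lot_size + coef[2]

baseline = predict_wear(18.0, 400)   # today's settings
proposed = predict_wear(23.0, 200)   # warmer warehouse, half the lot size
print(f"baseline wear: {baseline:.1f}%, proposed wear: {proposed:.1f}%")
```

The point is not the model itself but the workflow: candidate changes are evaluated against historical data first, and only the promising ones are piloted physically.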
This integration of predictive analytics with the more traditional process improvement tools that are the cornerstone of Six Sigma practice creates a much more holistic and complete view of performance improvement.
There are a few key building blocks that must be in place for companies to reach this level of integration and application of Six Sigma and big data tools.
1. Data inputs – from the right areas in and around the process, using standard Six Sigma approaches
2. Data integration – providing the context and links between dynamic data elements (and sources), typically utilizing a big data tool or platform
3. Data retention – developing a (secure) historic block of data that can be utilized to build and test improvement models
4. Six Sigma methodologies – applying the tried-and-true practices for identifying root causes, and testing and applying improvements in both the digital and physical worlds
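The second building block, data integration, is often the least familiar. A minimal sketch of the idea, with entirely hypothetical system names and fields: cycle-time records from an MES export are joined with ambient-temperature readings from an IoT gateway, keyed by machine and shift, to produce the contextualized records a model would consume.

```python
# Hypothetical integration sketch: inner-join process metrics (MES) with
# contextual sensor data (IoT gateway) on a shared (machine, shift) key.
mes_records = [
    {"machine": "CNC-01", "shift": "A", "cycle_time_s": 42.5},
    {"machine": "CNC-01", "shift": "B", "cycle_time_s": 44.1},
    {"machine": "CNC-02", "shift": "A", "cycle_time_s": 39.8},
]

iot_readings = [
    {"machine": "CNC-01", "shift": "A", "ambient_temp_c": 19.5},
    {"machine": "CNC-01", "shift": "B", "ambient_temp_c": 22.0},
    {"machine": "CNC-02", "shift": "A", "ambient_temp_c": 20.1},
]

def integrate(mes, iot):
    """Attach ambient temperature to each MES record that has a match."""
    context = {(r["machine"], r["shift"]): r["ambient_temp_c"] for r in iot}
    joined = []
    for rec in mes:
        key = (rec["machine"], rec["shift"])
        if key in context:
            joined.append({**rec, "ambient_temp_c": context[key]})
    return joined

model_input = integrate(mes_records, iot_readings)
```

In practice this join happens inside a big data platform at far larger scale, but the principle is the same: each process measurement is enriched with the conditions under which it occurred, and the retained result (building block 3) becomes the training data for improvement models.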
Many fear that the momentum behind big data will push the more traditional improvement tools out of our manufacturing institutions. However, if the tools are viewed as complementary, big data can serve to revitalize a strong Six Sigma practice and increase its importance for continuous improvement.
The manufacturing plant of tomorrow is not only one that depends on robotics and automation – it is also one where optimization, defect reduction and data modeling can make or break a company. Integrating big data, and applying it to improve processes and the broader context and conditions surrounding their execution, alongside the traditional tools of Six Sigma will help companies truly thrive in this digital world.