What is the Cost of Poor Quality?
Part 2 (of a 4-part series)
So, what is the cost of poor quality? Placing the cost of poor quality in context for all stakeholders (particularly stakeholders in IT) requires that you be able to turn to the right data at the right time to make your business case. The cost of poor quality (e.g., scrap, rework, returns, and other internal and external failures) can give upper-level decision makers a stark reminder of the cost of neglecting quality from an enterprise point of view. These metrics are critical to making your business case, but they should not overshadow other variables in the overall cost of quality equation.
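The arithmetic behind these metrics is straightforward: cost of poor quality is typically tallied as the sum of internal failure costs (defects caught in-house) and external failure costs (defects that reach the customer). The sketch below illustrates that tally; the category names and dollar figures are hypothetical examples, not data from any particular organization.

```python
# Illustrative sketch: aggregating cost-of-poor-quality (COPQ) categories.
# All figures below are hypothetical, chosen only to show the arithmetic.

internal_failure = {  # defects caught before shipment
    "scrap": 120_000,
    "rework": 85_000,
}

external_failure = {  # defects that reached the customer
    "returns": 60_000,
    "warranty_claims": 45_000,
}

copq = sum(internal_failure.values()) + sum(external_failure.values())
revenue = 10_000_000  # hypothetical annual revenue for context

print(f"Cost of poor quality: ${copq:,}")
print(f"COPQ as a share of revenue: {copq / revenue:.1%}")
```

Expressing COPQ as a share of revenue, as in the last line, is one common way to give upper-level decision makers the enterprise-level context discussed above.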
Contextualizing Cost of Poor Quality
The costs of poor quality are high-profile and, broadly speaking, well understood by researchers. The hard statistics vary from sector to sector, but the overall trend is clear: the cost of quality rises substantially the closer to the consumer a defect comes to light, and conversely, it is most favorable when quality issues surface early in the value chain. The need to mitigate nonconformances as early as possible is clear.
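A common heuristic for this escalation is the "1-10-100" rule of thumb: a defect that costs one unit to fix at inspection costs roughly ten times that in production and a hundred times that in the field. The multipliers and base cost below are assumptions for illustration only, not figures from this article.

```python
# Illustrative sketch of how remediation cost can escalate as a defect
# moves downstream, using the common "1-10-100" heuristic.
# The base cost and multipliers are hypothetical assumptions.

base_cost = 50  # hypothetical cost to fix a defect at incoming inspection

stage_multipliers = {
    "incoming inspection": 1,
    "in-process (manufacturing)": 10,
    "in the field (customer)": 100,
}

for stage, multiplier in stage_multipliers.items():
    print(f"{stage}: ${base_cost * multiplier:,}")
```

Whatever the exact multipliers are in your sector, the shape of the curve is the argument for early detection.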
IT’s role in enabling timely resolution of quality concerns is key to your organization’s quality management success. Without the ability to sift through accurate historical data to pinpoint root causes, any quality management professional would struggle to identify and mitigate issues consistently before products move through the value chain. Unfortunately, it is at this point that the shortcomings and limitations of your company’s current IT infrastructure and software suite may come into play, and the real-world situation is too often far from ideal.
Recent research by the Aberdeen Group has shown that a significant number of organizations still struggle to measure quality metrics effectively. You may find this counterintuitive, since your organization has likely made substantial investments in new technology and IT architecture over the last few years specifically to address this issue. Yet your organization’s cost of quality metrics tie directly into IT’s ability to deliver timely and accurate data to the right personnel before products move on to manufacturing and downstream to the consumer.
IT Sprawl as a Real Phenomenon
As a quality management professional, you should never forget that IT sprawl is a very real phenomenon, particularly among large enterprises with multiple manufacturing sites and offices around the world. Today, critical quality management–related data may reside in fragmented silos: disparate data sources, enterprise applications, and proprietary (i.e., expensive to maintain) solutions. Integration is critical to success in today’s leaner manufacturing environment.
Integrating an IT architecture of such complexity is a daunting task, to say the least. Your organization may be very efficient at collecting and storing financial and quality management–related data, yet still struggle to consolidate additional disparate data sources when quality management issues emerge. In the end, your company may rely on wholly paper-driven processes to pinpoint and escalate quality management concerns to upper-level decision makers, which allows inefficiencies to dilute quality management processes overall.
Improving the Accuracy of Data Tied to the Cost of Poor Quality
In one worst-case scenario, less-than-ideal IT systems may actually be the root cause of quality issues, especially with respect to an extended supply chain. The fact of the matter is that your organization may not have an optimal IT architecture in place to collaborate with supply chain partners effectively – if at all. Your company may be missing a very important piece of the quality management puzzle: real-time visibility into supplier quality concerns.
As such, your organization may not be measuring cost of quality as well as you might assume. Large, complex enterprise software systems require careful implementation (and proactive maintenance) to ensure that all components of the software stack can coexist at a reasonable cost.
Context is key when discussing the cost of poor quality with stakeholders in IT, since these data often reside in silos of applications that are not inherently interoperable. For example, your enterprise’s software stack may include multiple ERP or MOM instances across disparate manufacturing sites as the result of mergers and acquisitions activity. In a data-saturated manufacturing enterprise, critical quality management intelligence may fall through the cracks of your organization’s disjointed corrective and preventive action (CAPA) processes, thus worsening the cost of poor quality overall.
Today, data quality is a mission-critical endeavor. Without timely, accurate, and ideally real-time visibility into quality management issues throughout the value chain, your organization may see minimal gains in cost of poor quality metrics. An integrated quality management system is one strategy to consider in the face of such challenges. To continue the conversation, the next part of this series will delve into the cost of good quality.