Quality 4.0 – Data and Analytics

29 September 2022
Martin Graham, NQA's Principal Assessor for Quality, continues his series on Quality 4.0 and what it means for the future of Quality Management…

In my previous blog I wrote about the principles and context of Quality 4.0. As a quick reminder, in recent times we have seen rapid advances in connectivity, mobility, analytics, scalability and data resulting in what has been called the ‘fourth industrial revolution’, or Industry 4.0.

This fourth industrial revolution has digitised operations and resulted in transformations in manufacturing and operational efficiency, supply chain performance and monitoring, and innovation, and has in some cases enabled entirely new business models to be developed.

Quality 4.0 is seeking to align ‘traditional’ quality and systems management with ‘Industry 4.0’ principles.

Quality 4.0 is not about technology, but the users of that technology and the processes they use to maximise value, efficiency and effectiveness.

In this series I will be delving into the elements of Quality 4.0 in more detail, starting with Data and Analytics.

Data

Data driven decision making has been a fundamental principle of quality management and improvement for a very long time. Recent updates in management system standards support and emphasise the relevance and importance of evidence-based decision making.

Despite this, many organisations continue to struggle with data and evidence in a practical sense when it comes to quality.

There are five areas to consider when it comes to data:

  1. Volume: Management systems have traditionally generated a large number of data records (e.g. KPIs, corrective actions, incident responses, QC data). However, the amount of data potentially sourced from connected system devices is much larger, which requires consideration of how data collection and sorting will be approached.

  2. Variety: Data within a system can be considered to fall into three types: structured, semi-structured and unstructured. The sources of these can vary, so we need to establish the type of data being collected and whether it requires refinement and structuring in order to be considered reliable.

  3. Velocity: This is the rate at which a system generates, and an organisation gathers, data. Connected systems and devices tend to provide data at a much greater velocity than traditional systems.

  4. Veracity: This is the accuracy of data. Traditional quality system data may be considered to have low veracity due to disconnected systems and a lack of automation and/or validation.

  5. Transparency: The degree to which data is ‘visible’ within a system. Data should be accessible no matter the source or where it is stored, and systems should be set up to ensure a common data model is applied (a minimal sketch of one follows this list).
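To make the ‘variety’ and ‘transparency’ points concrete, here is a minimal Python sketch of a common data model: a structured QC row and a semi-structured device message are both reduced to the same record shape, so the data stays visible regardless of source. All field, source and device names are illustrative assumptions, not drawn from any particular standard or product.

```python
import json
from dataclasses import dataclass
from datetime import datetime

# Hypothetical common data model: every quality record, whatever its
# source, is reduced to the same minimal shape so it remains 'visible'
# (transparent) across the system.
@dataclass
class QualityRecord:
    source: str          # where the data originated
    timestamp: datetime  # when it was generated
    metric: str          # what was measured
    value: float         # the measurement itself

def from_qc_row(row: dict) -> QualityRecord:
    """Structured data, e.g. a row from a QC results table."""
    return QualityRecord(
        source="qc_database",
        timestamp=datetime.fromisoformat(row["inspected_at"]),
        metric=row["characteristic"],
        value=float(row["measured_value"]),
    )

def from_device_payload(payload: str) -> QualityRecord:
    """Semi-structured data, e.g. a JSON message from a connected device."""
    msg = json.loads(payload)
    return QualityRecord(
        source=msg.get("device_id", "unknown_device"),
        timestamp=datetime.fromtimestamp(msg["ts"]),
        metric=msg["sensor"],
        value=float(msg["reading"]),
    )

records = [
    from_qc_row({"inspected_at": "2022-09-01T08:30:00",
                 "characteristic": "bore_diameter_mm",
                 "measured_value": "12.02"}),
    from_device_payload('{"device_id": "press-07", "ts": 1662021000, '
                        '"sensor": "bore_diameter_mm", "reading": 12.05}'),
]
for record in records:
    print(record)
```

Whatever shape the model actually takes, the design choice is the same: normalise at the point of collection, so that analysis downstream never has to care where a record came from.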

Analytics

Data is of little value without effective analysis. Effective analysis, despite being a core requirement of a quality management system, remains a stumbling block for many organisations.

Without effective analysis, the achievement of objectives becomes very difficult if not impossible.

All too often there is a reliance on lagging metrics and a distinct lack of real-time data within a system.

We can think of analysis in four main areas (a worked sketch follows the list):

  • Descriptive (what happened?);

  • Diagnostic (why did it happen?);

  • Predictive (what will happen?);

  • Prescriptive (what action should be taken?).
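As a rough worked example of these four areas, the short Python sketch below runs each of them over some invented weekly nonconformance counts; the numbers, causes and the action threshold are all assumptions made for illustration.

```python
from collections import Counter

# Invented weekly nonconformance counts and their recorded causes;
# purely illustrative, not real data.
weekly_counts = [4, 6, 5, 8, 9, 11]
causes = ["setup error", "material defect", "setup error",
          "operator error", "setup error", "material defect"]

# Descriptive: what happened?
mean = sum(weekly_counts) / len(weekly_counts)
print(f"Descriptive: mean of {mean:.1f} nonconformances per week")

# Diagnostic: why did it happen?
top_cause, n = Counter(causes).most_common(1)[0]
print(f"Diagnostic: most frequent cause is '{top_cause}' ({n} of {len(causes)})")

# Predictive: what will happen? A naive least-squares trend line.
xs = range(len(weekly_counts))
x_bar = sum(xs) / len(weekly_counts)
slope = (sum((x - x_bar) * y for x, y in zip(xs, weekly_counts))
         / sum((x - x_bar) ** 2 for x in xs))
forecast = mean + slope * (len(weekly_counts) - x_bar)
print(f"Predictive: trend of {slope:+.1f}/week, next week ~{forecast:.0f}")

# Prescriptive: what action should be taken? A simple threshold rule.
if forecast > 10:
    print(f"Prescriptive: raise a corrective action against '{top_cause}'")
```

The point is not the arithmetic but the progression: each layer of analysis asks a more forward-looking question of the same underlying records.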

Companies working towards Quality 4.0 should build their analytics strategy and policy to ensure that insights are realised, whilst also considering the use of AI/ML to complement traditional analysis across the four areas above.
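As one illustration of where AI/ML might complement traditional analysis, the sketch below uses scikit-learn's IsolationForest to flag out-of-family QC measurements for review; the measurement values and the contamination setting are invented for the example.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Invented QC measurements (a dimension in mm): mostly in control,
# with a few out-of-family values an inspector might otherwise miss.
rng = np.random.default_rng(0)
measurements = np.concatenate([
    rng.normal(loc=12.0, scale=0.02, size=200),  # normal production
    [12.30, 11.65, 12.42],                       # suspect readings
]).reshape(-1, 1)

# IsolationForest learns what 'typical' looks like and scores outliers;
# contamination is a guess at the outlier fraction, not a known truth.
model = IsolationForest(contamination=0.02, random_state=0)
labels = model.fit_predict(measurements)  # -1 marks an anomaly

for value in measurements[labels == -1].ravel():
    print(f"Flagged for review: {value:.2f} mm")
```

An inspector still decides what the flags mean; the model simply narrows where to look, which is exactly the complementary role suggested above.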