Spectroscopy Since 1975

Four Generations of Quality: key Quality Indicators and their impact on the associated environments

John P. Hammond

Technical Manager, Starna Scientific Limited, 52–54 Fowler Road, Hainault, Essex IG6 3UT, UK

DOI: https://doi.org/10.1255/sew.2021.a57
© 2021 The Authors
Published under a Creative Commons BY-NC-ND licence

Introduction

Having now established two of the principal tracks with respect to the quality systems in use, this article, the sixth in the series, focuses on some of the specific Quality “tools” used in both the ISO and GxP environments; how these are defined, applied and used; and how they have evolved over time. In addition, the crossover between the two environments is examined through the adoption and exchange of these concepts, e.g. Qualification from GxP being implemented in ISO laboratories and, vice versa, Expanded Uncertainty Budgets from ISO being used in updates to pharmaceutical standards. As we shall see, this process began in earnest at the end of the last millennium and continues, evolves and accelerates to this day.

The key quality “tools”: Traceability; Validation, Qualification and Calibration (VQC); and Uncertainty of Measurement

Traceability

Traceability is the process by which a given pathway can be established as an unbroken chain of events.

Traceability is a key concept in calibration and validation processes and is defined in ISO/IEC (International Organization for Standardization/International Electrotechnical Commission) Guide 99:2007 International Vocabulary of Metrology (VIM) as the:

“Property of a measurement result whereby the result can be related to a reference through a documented unbroken chain of calibrations, each contributing to the measurement uncertainty”

International agreements between National Metrology Institutes (NMIs), coordinated at the highest level by the Bureau International des Poids et Mesures (BIPM), mean that measurements made against traceable references in one country will be accepted in any other country that is a signatory to the agreement.
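To make the “unbroken chain” concrete, the following minimal Python sketch (the link names and uncertainty values are invented for illustration) models a traceability chain in which each calibration link contributes its own standard uncertainty, with independent contributions combined in quadrature in the usual GUM fashion.

```python
import math

# Hypothetical illustration of a traceability chain: each calibration link
# contributes its own standard uncertainty (values here are invented), and
# independent contributions are combined in quadrature (root-sum-of-squares).
chain = [
    ("NMI primary standard",          0.0004),
    ("Accredited reference material", 0.0010),
    ("Working standard",              0.0020),
    ("Routine instrument check",      0.0030),
]

combined_u = math.sqrt(sum(u ** 2 for _, u in chain))
expanded_u = 2 * combined_u  # coverage factor k = 2, ~95 % confidence

for name, u in chain:
    print(f"{name:30s} u = {u:.4f}")
print(f"Combined standard uncertainty u_c = {combined_u:.4f}")
print(f"Expanded uncertainty (k = 2)  U   = {expanded_u:.4f}")
```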

Validation, Qualification and Calibration (VQC)

The value of a chemical measurement depends upon the degree of confidence that can be placed in the result and thereby its “fitness for purpose”. If you couple this statement with the ISO and GxP Quality Standards, one irrefutable observation is that both have a common “crossover” requirement: effective equipment performance verification, often simply referred to as calibration. This requirement can be shown graphically as a series of concentric Validation/Qualification/Calibration (VQC) “shells”, where the Total Quality Management (TQM) outer framework is governed by either of these system tracks.

This structure affects both User and Vendor, and the sequential process shown in Figure 1 will depend upon one’s initial starting position. As a User, the overall perspective is planned before specific tasks are undertaken. As an instrument manufacturer, clearly establishing calibration to specification is the first quality requirement of a newly produced instrument.

Figure 1. VQC “shells”.

From the user’s viewpoint:

  1. Establish TQM protocols
  2. Formulate Validation plan
  3. Qualify instrument system
  4. Ensure initial calibration (and maintain it)

From the vendor’s perspective:

  1. Ensure calibration to specification
  2. Assist User in the Qualification at the system location
  3. Assist/advise on additional Validation/TQM aspects
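Purely as an illustration of this shell structure (the shell names follow Figure 1; the code itself is a hypothetical sketch, not part of any standard), the same ordered set of shells can be traversed outside-in by the User and inside-out by the Vendor:

```python
# Purely illustrative sketch of the concentric VQC "shells" of Figure 1,
# ordered from the outermost (TQM) to the innermost (Calibration).
# The User works outside-in; the instrument Vendor works inside-out.
VQC_SHELLS = [
    "Total Quality Management (TQM)",
    "Validation",
    "Qualification",
    "Calibration",
]

user_sequence = VQC_SHELLS                      # TQM -> ... -> Calibration
vendor_sequence = list(reversed(VQC_SHELLS))    # Calibration -> ... -> TQM

print("User (outside-in):  ", " -> ".join(user_sequence))
print("Vendor (inside-out):", " -> ".join(vendor_sequence))
```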

Uncertainty of measurement

In metrology, measurement uncertainty is the expression of the statistical dispersion of the values attributed to a measured quantity. All measurements are subject to uncertainty and a measurement result is complete only when it is accompanied by a statement of the associated uncertainty, such as the standard deviation. By international agreement, this uncertainty has a probabilistic basis and reflects incomplete knowledge of the quantity value. It is a non-negative parameter.
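As a simple illustration of how such a statement might be produced, the sketch below performs a Type A evaluation on a set of hypothetical replicate readings and reports the result with an expanded uncertainty at a coverage factor of k = 2; a real uncertainty budget would, of course, also include Type B contributions.

```python
import statistics

# Minimal sketch with hypothetical replicate readings: a Type A evaluation of
# measurement uncertainty from repeated observations, reported with a coverage
# factor k = 2.  A real uncertainty budget would also include Type B terms.
replicates = [0.5012, 0.5008, 0.5015, 0.5010, 0.5011]   # e.g. absorbance readings

mean = statistics.mean(replicates)
s = statistics.stdev(replicates)                # sample standard deviation
u = s / len(replicates) ** 0.5                  # standard uncertainty of the mean
expanded_u = 2 * u                              # expanded uncertainty, k = 2

print(f"Result: {mean:.4f} +/- {expanded_u:.4f} (k = 2)")
```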

For “justifiable” reasons, the pharmaceutical industry has resisted the requirement to establish an expanded uncertainty budget for its own reference materials, since even before ISO Guide 25 became ISO/IEC 17025 in 1999; but this may be changing, as discussed below.

“Pre-History”: the years before 1940

Traceability

In 1901, the National Bureau of Standards (NBS) was founded in the USA, and in 1905 it produced its first Standard Reference Material (SRM), an iron reference for the smelting industry. For many years, these SRMs were the primary source of such reference materials, and gave rise to the sometimes latterly abused description of “NBS/NIST traceable” Certified Reference Materials (CRMs).

Uncertainty of measurement

If we broaden this requirement to, essentially, the use of statistics: statistics arose around 1700 from the interplay of mathematical concepts and the needs of several applied sciences, including astronomy, geodesy, experimental psychology, genetics and sociology, and emerged as a distinct and mature discipline around 1900.

1st Generation: the years between 1940 and 1975

Traceability

Three fundamental scale tools underpin our measurement and calibration processes:

  • In 1948, William Meggers proposed that the metre be redefined based on wavelengths of green light produced by a mercury lamp he had developed, a method which was far superior to the metre bar in use at the time. In 1960, the scientific community did redefine the metre based on wavelengths of light, but they picked the orange-red light of the krypton lamp.
  • Harold Lyons and his colleagues at NIST built the world’s first atomic clock in 1949. Based on the frequency of the microwaves emitted by the ammonia molecule, the clock was not accurate enough to be used as a time standard, but it did prove the concept. Louis Essen at the UK’s National Physical Laboratory built the first atomic clock accurate enough to be a time standard in 1955.
  • In 1971, Robert Kamper and James Zimmerman of NIST Boulder proposed and demonstrated a new absolute thermometer based on the principle that a resistor (a device used to control electrical current) generates random noise from jiggling electrons, the magnitude of which depends only on the temperature of the resistor and a fundamental constant, the Boltzmann constant.
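For the noise thermometer, the underlying relation (not quoted above) is the Johnson–Nyquist formula, in which the mean-square noise voltage across a resistance R, measured over a bandwidth Δf, depends only on the temperature T and the Boltzmann constant kB:

```latex
% Johnson–Nyquist relation underlying the noise thermometer: the mean-square
% noise voltage across a resistance R, over a bandwidth \Delta f, depends only
% on the temperature T and the Boltzmann constant k_B.
\[
  \langle V^{2} \rangle = 4\, k_{\mathrm{B}}\, T\, R\, \Delta f
\]
```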

Uncertainty of measurement

More than 1000 pages long, the NIST Handbook of Mathematical Functions, colloquially known as Abramowitz and Stegun after its authors, was first published in 1964 and has been reprinted many times since. The Handbook is likely the most widely distributed and most cited NIST technical publication of all time. During the mid-1990s, the book was being cited, on average, once every 1.5 hours of each working day, a rate that reflects the acceleration in the use of this science described later.

2nd Generation: the years 1975 to 2000

As previously stated in this series of articles, from a personal perspective I am able to reflect on the changes over this time frame, having joined the Quality Assurance laboratories of a fine chemical manufacturer in 1975; and whilst my reflections may be “coloured” somewhat (no pun intended) by the analytical environment(s) that I have interacted with to this date, I believe they do provide an accurate perception of the science in use and/or being evolved during this period.

As a general consideration, during this period the first editions of many key references and standards were produced, as described in the earlier articles in this series. These timelines show the almost exponential take-off in the revision and updating of these concepts during the last decade of the century, culminating in several key publications in its final few years. It is significant that it is at the end of this decade that we see the clarification of the Quality concepts now familiar to readers of this column and in general use in a regulated environment. This continues into the 3rd generation, as these standards and references are revised into their second and third editions.

Traceability

Here are a few interesting examples of the expansion of this concept during this generation:

  • Made in the microgravity environment of the Space Shuttle Challenger during its maiden flight in April 1983, NIST standard reference material 1960 contained 5-mL vials of 10-µm polystyrene beads. The perfectly spherical, stable beads made for more consistent measurements of small particles like those found in medicines, cosmetics, food products, paints, cements and pollutants.
  • In 1984, as part of the revision that produced the 8th Edition of the Analar standards publication, the Chief Chemist at Hopkin and Williams, D.J. Bucknell, together with his colleagues at the time, produced a titrimetric schema. Unique at the time, this schema used traceability to show that the accuracy of fundamental standard volumetric solutions could be traced back to a primary solid, i.e. high-purity (99.9999 %) silver (a minimal worked example follows this list).
  • NIST produced the world’s first DNA profiling standard, SRM 2390, in 1992 at the request of the National Institute of Justice, the research arm of the US Department of Justice. Developed over the course of two years, SRM 2390 was made to test every step of the restriction fragment length polymorphism analysis method for identifying people using DNA.
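The following minimal worked example (all masses and volumes are invented) illustrates the principle behind this schema: a standard silver solution is prepared gravimetrically from the high-purity metal, and a chloride volumetric solution standardised against it inherits that traceability.

```python
# Hypothetical worked example of the "trace back to a primary solid" idea
# (all masses and volumes are invented): a standard silver solution is made
# gravimetrically from high-purity silver metal, and a chloride volumetric
# solution standardised against it inherits that traceability.
M_AG = 107.8682                       # molar mass of silver, g/mol

# Step 1: primary silver solution from a weighed portion of the pure metal
mass_ag_g = 1.0787                    # mass of silver dissolved
purity = 0.999999                     # certified mass fraction (99.9999 %)
flask_volume_l = 0.1000               # made up to 100.0 mL
c_ag = mass_ag_g * purity / M_AG / flask_volume_l      # mol/L

# Step 2: chloride titrant standardised against an aliquot of that solution
aliquot_l = 0.02500                   # 25.00 mL aliquot of the silver solution
titre_l = 0.02497                     # NaCl volume at the end-point
c_nacl = c_ag * aliquot_l / titre_l   # 1:1 stoichiometry, Ag+ + Cl- -> AgCl

print(f"Ag+ standard: {c_ag:.5f} mol/L")
print(f"NaCl titrant: {c_nacl:.5f} mol/L (traceable to the silver metal)")
```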

Validation, Qualification and Calibration (VQC)

Reflecting the earlier comment about the acceleration of this process, multiple key contributions, previously referenced in Quality Matters columns and associated primarily with the Qualification of instrument systems, were published in the 1998 to 2000 period.

Uncertainty of measurement

From a personal perspective, during this period the book Statistics for Analytical Chemistry,1 provided an invaluable tool for understanding the mathematics. As previously stated, this revision matched the environment of the day, and it is interesting to read the published synopsis of the text:

“This third edition gives a clear and lucid account of the underlying principles of statistical methods. It reflects the enormous impact of microelectronics for the rapid calculation of chemometric procedures such as pattern recognition, optimization and numerical techniques. Significant changes and updates to most chapters of the previous edition have been made, particularly in curvilinear regression methods, robust statistical methods, multivariate methods, outliers in univariate statistics and calibration methods, initial data analysis, and experimental design and optimization.”

3rd Generation: the years 2000 to 2020

As stated earlier, with respect to concepts and application, this period can be described as one of consolidation and, at least in some areas, harmonisation, cf. the ISO 17000 and ISO 9000 series standards, where the quality management system in the latest versions of the ISO 17000 series standards can be ISO 9001.

Also, history does appear to be repeating itself: in both the GxP and ISO environments, new related concepts were introduced at the end of the period and, as these will become significant discussion items moving forward, they are discussed below under the current 4th generation.

Traceability

During the period 2001–2005 NIST worked on the convergence of many areas of laser physics, resulting in the production of an optical frequency comb, a laser specially designed to produce a series of very short (a few millionths of a billionth of a second), equally spaced pulses of light. These frequency combs dramatically simplified and improved the accuracy of frequency metrology and made it possible to build optical atomic clocks. In 2005, NIST/JILA fellow John L. (Jan) Hall shared half of the 2005 Nobel Prize in Physics for his “contributions to the development of laser-based precision spectroscopy, including the optical frequency comb technique”.

In 2016, ISO/REMCO published their Technical Report TR 16476 on “Reference materials — Establishing and expressing metrological traceability of quantity values assigned to reference materials”.

Validation, Qualification and Calibration (VQC)

During this period, within the shell structure previously described, and specific to an analytical data generation environment, a pyramidal structure was also proposed and implemented. There are four critical components involved in the generation of reliable and consistent data (quality data). Figure 2 shows these components as layered activities within a quality triangle. Each layer adds to the overall quality. Analytical Instrument Qualification (AIQ) forms the base for generating quality data, and the other components essential for generating quality data are analytical method validation, system suitability tests and quality control check samples.

Figure 2. Components of data quality, redrawn from United States Pharmacopeia (USP) <1058>.

This structure has been fully discussed in a previous article,2 so will not be expanded further here. However, unsurprisingly, it continues to evolve as discussed below in the 4th Generation.

Uncertainty of measurement

In November 2013, European regulators announced that they, along with the FDA, had released a second question-and-answer document intended to provide guidance to industry on the concept of Quality by Design (QbD). The QbD concept is well known within most regulatory circles. Simply stated, it is the belief that quality should be designed, not tested, into the final product, including its manufacturing processes. In theory (and regulators say in practice as well), this results in fewer compliance problems, because a manufacturer addresses problems before they exist, and addresses them more systematically when they do occur. In the pharmaceutical sector, QbD concepts are broadly incorporated into the regulatory systems of any region that uses the International Conference on Harmonisation of Technical Requirements for Registration of Pharmaceuticals for Human Use (ICH). Both the FDA and the European Medicines Agency (EMA) have been pushing QbD concepts heavily in recent years. Fundamental to this concept are accurate measurement processes, which by definition must include an associated uncertainty evaluation.

4th Generation: from 2021 forward

Now, to bring our discussions up to date in both the ISO and GxP environments, the following key documents have been introduced, and are either in the process of being incorporated into the appropriate standard(s) or have already been implemented:

  • New regulations for ultraviolet and visible spectroscopy were introduced by the US and European pharmacopoeias. The revised US Pharmacopeia (USP) Chapter <857> became mandatory on 1 December 2019. Edition 10.0 of European Pharmacopoeia (EP) Chapter 2.2.25 became mandatory on 1 January 2020. Earlier versions of the pharmacopoeias described a limited set of generic tests to qualify an instrument for wavelength, absorbance, stray light and resolution (spectral bandwidth). If the instrument passed these tests, it could be described as “pharmacopoeia compliant”. This is no longer the case: these limited tests no longer qualify an instrument for the variety of measurements encountered in the modern pharmaceutical laboratory. Users must now demonstrate “fitness for purpose”, i.e. that the instrument has the capability to perform the actual analysis to the required accuracy and precision. The qualification measurements must, therefore, be made at parameter values that match or “bracket” those used in the analysis; a minimal illustration of this bracketing check is sketched after this list.
  • The latest version of ISO/IEC 17025 introduces the concept (and evaluation) of risk, in the broadest of terms.
  • USP is undertaking an extensive review of its qualification processes and, in 2017, published an article in Pharmacopeial Forum, Volume 43, Issue 1, “Stimuli to the Revision Process: Proposed New USP General Chapter: The Analytical Procedure Lifecycle <1220>”, for public comment.
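The following hypothetical sketch illustrates the bracketing check referred to in the first bullet above (all parameter values and names are invented):

```python
# Hypothetical sketch of the "bracketing" requirement: check that the points
# at which an instrument was qualified enclose the parameter values actually
# used in the analysis.  All values and names below are invented.
def brackets(qualified: list[float], used: float) -> bool:
    """True if the analytical value lies within the qualified range."""
    return min(qualified) <= used <= max(qualified)

qualified_wavelengths_nm = [235.0, 257.0, 313.0, 350.0]   # e.g. CRM points
qualified_absorbances = [0.25, 0.50, 1.00, 1.50]

analysis_wavelength_nm = 280.0
analysis_absorbance = 0.80

ok = (brackets(qualified_wavelengths_nm, analysis_wavelength_nm)
      and brackets(qualified_absorbances, analysis_absorbance))
print("Qualification brackets the analysis:", ok)
```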

These evolutionary revisions introduce and expand an array of new concepts, such as the Analytical Target Profile (ATP), Control Space, “Fitness for Purpose”, “Proof of Control”, Data Integrity, etc.

An expansion and discussion of these concepts will be included in the next article in the series, which will also mark a change of direction: from that point forward, we will concentrate on specific areas of interest in the Quality environment in which we are, or will be, working in the future.

Now that we are up to date, we have taken the DeLorean “Back to the Future”; next, as in the matching film sequel “Back to the Future Part II”, we will project into the future.

References

  1. J.C. Miller and J.N. Miller, Statistics for Analytical Chemistry, 3rd Edn. Ellis Horwood/Prentice Hall (1993).
  2. C. Burgess and J.P. Hammond, “Modernisation of the spectroscopic General Chapters in the United States Pharmacopeia (USP)”, Spectrosc. Europe 27(1), 26–29 (2015). https://doi.org/10.1255/sew.2015.a1
 
John Hammond

John Hammond is an experienced analytical scientist, spectroscopist and technical marketing professional, skilled in the development, production and marketing of analytical systems for highly regulated and controlled industries. He is a Fellow of the Royal Society of Chemistry (FRSC), an executive member of ISO/TC334 and an Expert Advisor to the United States Pharmacopeia, General Chapters, Chemical Analysis committee.
[email protected]
