written by Maree Stuart
Method selection is one area that many labs take for granted. You often have only one choice… or so you think. After all, when all you have is a hammer, everything looks like a nail, right?
In this article we’ll explore some of the questions labs should be asking so that the right method is used for the right problem.
What’s required in method selection?
Let’s start with what ISO/IEC 17025 has to say about method selection. As the “oracle” of good laboratory practice, it is the natural starting point, with ISO 15189 playing the same role if you’re working in a medical lab.
Clause 7.2.1.1 of ISO/IEC 17025 states (with our emphasis):
The laboratory shall use appropriate methods and procedures for all laboratory activities and, where appropriate, for evaluation of the measurement uncertainty as well as statistical techniques for analysis of data.
We have to use “appropriate methods”. What does that mean?
ISO/IEC 17025 goes on to require (in clause 7.2.1.4):
When the customer does not specify the method to be used, the laboratory shall select an appropriate method and inform the customer of the method chosen. Methods published either in international, regional or national standards, or by reputable technical organizations, or in relevant scientific texts or journals, or as specified by the manufacturer of the equipment, are recommended. Laboratory-developed or modified methods can also be used.
For those of you playing along in the medical world, the companion clause in ISO 15189 is 7.3.1 which starts with the requirement of:
The laboratory shall select and use examination methods which have been validated for their intended use to assure the clinical accuracy of the examination for patient testing.
Hmmmm, not a lot of guidance in those opening words but at least it gives a clue that labs need to think about the intended use in terms of clinical accuracy.
We can extrapolate the requirement outlined in ISO 15189 to create a broader principle. Put simply: when selecting a method, labs need to choose one that suits the intended use of the results. That means method selection relies on an understanding of how the results of testing or calibration will be used, and that understanding comes from communication with clients and other stakeholders, such as regulators.
A few years back, we were all abuzz (see what I did there?) with adulterated honey and the unreliable test methods used to identify “impure” honey. Labs were using a method, the official honey or C4 test, that did not accurately test for substances that should not be found in pure honey. Clearly, something had gone wrong with selecting the right method for the job, as the technology was readily available to detect these impurities.
As Honeygate demonstrates, laboratory testing is an integral part of scientific research, quality control, and diagnosis. Selecting the appropriate test method is crucial to obtaining accurate and reliable results. With numerous test methods available, it’s essential to follow a systematic approach to ensure the method chosen aligns with the research, analytical, or diagnostic objectives.
What are the steps in method selection?
Step 1: Define the Testing Objective
The first step in selecting a laboratory test method is to clearly define the testing objective. Determine the specific question you aim to answer or the problem you seek to address. Understanding the purpose of the test will guide you in choosing the most relevant method that aligns with your research, analytical, or diagnostic goals.
For this step to produce a good outcome, it is vital for labs to engage with clients, specifiers, and other stakeholders to understand the problem that testing seeks to solve. Getting to the “Why” of testing or calibration will reveal the goals.
Step 2: Review Existing Literature and Standards
Before developing or selecting a new test method, review existing literature and standards. Many well-established organisations and scientific bodies publish standardised test methods for various analyses. These methods have undergone rigorous validation and are widely accepted. Utilising standardised methods can enhance the credibility and reproducibility of your results.
But, as Honeygate demonstrates, just because it’s an industry standard doesn’t make it right. That’s why step 1 is so important.
Step 3: Assess Equipment and Resources
Evaluate the laboratory’s available equipment, expertise, and resources. Some test methods may require specialised instruments or highly skilled personnel. Consider the cost and feasibility of acquiring any necessary equipment or expertise for the selected method. It is essential to ensure that the laboratory has the capability to conduct the chosen test or calibration accurately.
Step 4: Evaluate Test Method Characteristics
Understand the characteristics of each potential test method. Consider factors such as sensitivity, specificity, accuracy, precision, and the detection limit. We’ve written previously on these factors. Assess how well these characteristics align with your testing objectives and the sample type you will be working with. For instance, if you need high sensitivity for trace analysis, choose a method that offers superior sensitivity.
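As a rough illustration of how these characteristics can be quantified from validation data, the sketch below estimates sensitivity and specificity for a qualitative method, and a detection limit for a quantitative one, using the common “mean of blanks plus three standard deviations” rule of thumb. All of the numbers are invented for the example, not results from any real method:

```python
# Illustrative estimates of method performance characteristics.
# All data below are made-up validation results, not real measurements.
from statistics import mean, stdev

# Qualitative method: counts from testing samples of known status
true_pos, false_neg = 48, 2    # results on known-positive samples
true_neg, false_pos = 45, 5    # results on known-negative samples

sensitivity = true_pos / (true_pos + false_neg)   # ability to detect true positives
specificity = true_neg / (true_neg + false_pos)   # ability to reject true negatives

# Quantitative method: a common rule of thumb estimates the detection
# limit from replicate blank measurements as mean + 3 * standard deviation
blanks = [0.02, 0.03, 0.01, 0.02, 0.04, 0.02, 0.03]  # blank readings, mg/kg
detection_limit = mean(blanks) + 3 * stdev(blanks)

print(f"Sensitivity: {sensitivity:.2f}")      # 0.96
print(f"Specificity: {specificity:.2f}")      # 0.90
print(f"Detection limit: {detection_limit:.3f} mg/kg")
```

Comparing figures like these against the requirements uncovered in Step 1 makes it much easier to judge whether a candidate method is actually fit for the job.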
Step 5: Analyse Sample Matrix and Composition
The nature of the sample matrix (solid, liquid, gas) and its composition can significantly impact the choice of a test method. Some methods may be more suitable for complex matrices, while others work better for simple or homogeneous samples. Ensure that the selected method is compatible with the properties of your sample.
Step 6: Consider Time and Throughput
Evaluate the turnaround time and throughput required for testing. Some methods are time-consuming and may not be suitable for high-throughput applications. Conversely, certain rapid test methods may sacrifice accuracy. Strike a balance between efficiency and precision based on your specific needs and those of your clients.
Step 7: Validate and Verify
Before implementing a new test method, it’s crucial to validate or verify its performance. A validation study demonstrates that the method is fit for its intended use and consistently produces accurate and reliable results under specific conditions. Verification, on the other hand, confirms that your laboratory can achieve the established performance of a standard method and that it aligns with the laboratory’s requirements.
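In practice, part of a verification exercise often boils down to checking bias and precision against pre-set acceptance criteria. The sketch below shows the shape of such a check; the replicate results, the certified reference value, and the acceptance limits are all illustrative assumptions:

```python
# Illustrative verification check: bias and precision of replicate
# measurements of a certified reference material. Data and limits are made up.
from statistics import mean, stdev

reference_value = 10.0                    # certified value of the reference material
replicates = [9.8, 10.1, 9.9, 10.2, 10.0, 9.7, 10.1]

bias_pct = 100 * (mean(replicates) - reference_value) / reference_value
rsd_pct = 100 * stdev(replicates) / mean(replicates)   # relative standard deviation

# Hypothetical acceptance criteria for this example
MAX_BIAS_PCT = 5.0
MAX_RSD_PCT = 10.0

passes = abs(bias_pct) <= MAX_BIAS_PCT and rsd_pct <= MAX_RSD_PCT
print(f"Bias: {bias_pct:+.1f}%, RSD: {rsd_pct:.1f}%, acceptable: {passes}")
```

Real validation and verification plans cover more characteristics than this, but recording the criteria before you collect the data, as this sketch implies, is the key discipline.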
Step 8: Monitor and Review
Once a test method is in use, it’s essential to regularly monitor its performance and review the results obtained. Identify any potential issues or deviations and take corrective actions promptly. Continuous monitoring and review help maintain the quality and effectiveness of the laboratory’s testing processes.
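Ongoing monitoring is often done with a control chart: a QC sample is run with each batch, and any result falling outside the mean plus or minus three standard deviations of the established baseline triggers investigation. A minimal sketch of that idea, with invented QC data:

```python
# Illustrative Shewhart-style control check. Baseline and QC results are invented.
from statistics import mean, stdev

baseline = [5.1, 4.9, 5.0, 5.2, 4.8, 5.0, 5.1, 4.9, 5.0, 5.0]  # historical QC results
centre = mean(baseline)
limit = 3 * stdev(baseline)          # control limits at +/- 3 standard deviations

new_results = [5.05, 4.95, 5.6]      # latest QC measurements
for value in new_results:
    in_control = abs(value - centre) <= limit
    print(f"{value:.2f} -> {'in control' if in_control else 'OUT OF CONTROL: investigate'}")
```

Established QC schemes add further rules (trends, runs, drift), but even this simple check turns “monitor and review” from a vague intention into a routine action.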
The Final Word
Selecting the appropriate laboratory method is a critical step in any process involving testing or calibration.
By following a systematic approach, users of lab data, such as researchers, regulators, and other professionals, can make well-informed decisions. Additionally, considering aspects such as sample matrix, time, and throughput is essential for ensuring the right method is selected.
A carefully selected test or calibration method enhances the credibility of research findings and supports high-quality analytical testing and diagnostic outcomes.
So next time someone asks you to perform a test or calibration, pause to think about why you’re performing that work and how the results will be used.
Got questions about method selection?
We understand that this is not the kind of thing that you spend your time on. We can help you develop a strong system that will make sure that your work is not for nothing, and that your clients remain happy. Give Maree a call at 0411 540 709, or email us at firstname.lastname@example.org if you need that discreet conversation.