Table 1.2. Figures of Merit for Instruments or Analytical Methods

No.  Parameter              Definition
 1   Accuracy               Deviation from true value
 2   Precision              Reproducibility of replicate measurements
 3   Sensitivity            Ability to discriminate between small differences in concentration
 4   Detection limit        Lowest measurable concentration
 5   Linear dynamic range   Linear range of the calibration curve
 6   Selectivity            Ability to distinguish the analyte from interferences
 7   Speed of analysis      Time needed for sample preparation and analysis
 8   Throughput             Number of samples that can be run in a given time period
 9   Ease of automation     How well the system can be automated
10   Ruggedness             Durability of measurement, ability to handle adverse conditions
11   Portability            Ability to move the instrument around
12   Greenness              Eco-efficiency in terms of waste generation and energy consumption
13   Cost                   Equipment cost + cost of supplies + labor cost

1.3.1. Sensitivity

The sensitivity of a method (or an instrument) is a measure of its ability to distinguish between small differences in analyte concentrations at a desired confidence level. The simplest measure of sensitivity is the slope of the calibration curve in the concentration range of interest. This is referred to as the calibration sensitivity. Usually, calibration curves for instruments are linear and are given by an equation of the form

    S = mc + s_bl    (1.11)

where S is the signal at concentration c and s_bl is the blank (i.e., the signal in the absence of analyte). Then m is the slope of the calibration curve and hence the sensitivity. When sample preparation is involved, the recovery of these steps has to be factored in. For example, during an extraction, only a fraction proportional to the extraction efficiency r is available for analysis. Then equation (1.11) becomes

    S = mrc + s_tbl    (1.12)

Now the sensitivity is mr rather than m. The higher the recovery, the higher the sensitivity. Near 100% recovery ensures maximum sensitivity.
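To make equations (1.11) and (1.12) concrete, the short Python sketch below fits a straight line to a hypothetical set of calibration standards and scales the slope by an assumed extraction recovery. The concentrations, signals, and the 85% recovery are invented for illustration and are not taken from the text.

```python
import numpy as np

# Hypothetical calibration standards (concentration, instrument signal).
conc = np.array([0.0, 1.0, 2.0, 5.0, 10.0])          # e.g., mg/L
signal = np.array([0.02, 0.51, 1.01, 2.49, 4.98])    # arbitrary units

# Least-squares fit of S = m*c + s_bl (equation 1.11).
m, s_bl = np.polyfit(conc, signal, 1)
print(f"calibration sensitivity m = {m:.3f}, blank signal s_bl = {s_bl:.3f}")

# If an extraction step recovers only a fraction r of the analyte,
# the effective sensitivity of the overall method is m*r (equation 1.12).
r = 0.85   # assumed extraction recovery (85%)
print(f"method sensitivity m*r = {m * r:.3f}")
```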
The blank is also modified by the sample preparation step; s_tbl refers to the blank that arises from the total contribution of sample preparation and analysis.

Since precision decreases at low concentrations, the ability to distinguish between small concentration differences also decreases. Therefore, sensitivity as a function of precision is measured by the analytical sensitivity, which is expressed as [4]

    α = mr/s_s    (1.13)

where s_s is the standard deviation based on sample preparation and analysis. Because of its dependence on s_s, the analytical sensitivity varies with concentration.

1.3.2. Detection Limit

The detection limit is defined as the lowest concentration or weight of analyte that can be measured at a specific confidence level. Near the detection limit, the signal generated approaches that from a blank. The detection limit is often defined as the concentration at which the signal/noise ratio reaches an accepted value (typically between 2 and 4). Therefore, the smallest distinguishable signal, S_m, is

    S_m = X_tbl + k*s_tbl    (1.14)

where X_tbl and s_tbl are the average blank signal and its standard deviation. The constant k depends on the confidence level; the accepted value is 3, which corresponds to a confidence level of 89%. The detection limit can be determined experimentally by running several blank samples to establish the mean and standard deviation of the blank. Substituting equation (1.12) into (1.14) and rearranging shows that

    C_m = (S_m - s_tbl)/m    (1.15)

where C_m is the minimum detectable concentration and S_m is the signal obtained at that concentration. If the recovery of the sample preparation step is factored in, the detection limit is given as

    C_m = (S_m - s_tbl)/(mr)    (1.16)

Once again, a low recovery increases the detection limit, and a sample preparation technique should aim at 100% recovery.
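The following sketch, again with invented numbers, shows how the detection limit of equations (1.14) to (1.16) could be estimated from replicate blank runs. The blank readings, calibration slope, recovery, and k = 3 are assumed values; the average blank is used as the estimate of the blank signal term in equation (1.16).

```python
import numpy as np

# Hypothetical replicate blank signals from the complete procedure
# (sample preparation plus analysis).
blanks = np.array([0.021, 0.018, 0.025, 0.019, 0.023, 0.020])

x_tbl = blanks.mean()        # average total blank signal, X_tbl
s_tbl = blanks.std(ddof=1)   # its standard deviation, s_tbl

k = 3       # confidence factor in equation (1.14)
m = 0.497   # calibration slope (assumed, e.g., from a fit like the one above)
r = 0.85    # assumed extraction recovery

# Smallest distinguishable signal, equation (1.14).
S_m = x_tbl + k * s_tbl

# Minimum detectable concentration with recovery factored in, equation (1.16);
# the mean blank serves as the estimate of the blank signal being subtracted.
C_m = (S_m - x_tbl) / (m * r)
print(f"detection limit ~ {C_m:.4f} concentration units")
```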
1.3.3. Range of Quantitation

The lowest concentration level at which a measurement is quantitatively meaningful is called the limit of quantitation (LOQ). The LOQ is most often defined as 10 times the signal/noise ratio. If the noise is approximated as the standard deviation of the blank, the LOQ is 10 x s_tbl. Once again, when the recovery of the sample preparation step is factored in, the LOQ of the overall method increases by 1/r.

For all practical purposes, the upper limit of quantitation is the point where the calibration curve becomes nonlinear. This point is called the limit of linearity (LOL). These can be seen from the calibration curve presented in Figure 1.3. Analytical methods are expected to have a linear dynamic range (LDR) of at least two orders of magnitude, although shorter ranges are also acceptable.

Considering all of these, the recovery of the sample preparation method is an important parameter that affects quantitative issues such as the detection limit, sensitivity, LOQ, and even the LOL. Sample preparation techniques that enhance performance (see Chapters 6, 9, and 10) result in a recovery (r) larger than 1, thus increasing the sensitivity and lowering detection limits.

1.3.4. Other Important Parameters

There are several other factors that are important when it comes to the selection of equipment in a measurement process. These parameters are items 7 to 13 in Table 1.2, and they may be more relevant in sample preparation than in analysis. As mentioned before, the bottleneck is very often the sample preparation rather than the analysis. The former tends to be slower; consequently, both measurement speed and sample throughput are determined by the discrete steps within the sample preparation. Modern analytical instruments tend to have a high degree of automation in terms of autoinjectors, autosamplers, and automated control/data acquisition. On the other hand, many sample preparation methods continue to be labor-intensive, requiring manual intervention. This prolongs analysis time and introduces random/systematic errors.

A variety of portable instruments have been developed in the last decade. Corresponding sample preparation methods, including online sample preparation, are being developed to make integrated total analytical systems. Many sample preparation methods, especially those requiring extraction, require solvents and other chemicals. Used reagents end up as toxic wastes, whose disposal is expensive. Greener sample preparation methods generate less spent reagent. Last but not least, cost, including the cost of equipment, labor, and consumables and supplies, is an important factor.
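As a minimal illustration of the quantities defined in Section 1.3.3, the sketch below converts the blank noise into an LOQ for the overall method and expresses the linear dynamic range in orders of magnitude. The blank standard deviation, slope, and recovery are carried over from the earlier illustrative values, and the limit of linearity is assumed.

```python
import math

# Assumed values for illustration (see the earlier sketches).
s_tbl = 0.0026   # standard deviation of the total blank signal
m = 0.497        # calibration slope
r = 0.85         # extraction recovery
LOL = 50.0       # assumed limit of linearity, in concentration units

# LOQ taken as 10 times the blank noise, converted to concentration;
# dividing by r reflects the 1/r increase for the overall method.
LOQ = 10 * s_tbl / (m * r)

# Linear dynamic range expressed in orders of magnitude (decades).
ldr_decades = math.log10(LOL / LOQ)
print(f"LOQ ~ {LOQ:.4f}, LDR ~ {ldr_decades:.1f} orders of magnitude")
```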
1.3.5. Method Validation

Before a new analytical method or sample preparation technique is implemented, it must be validated. The various figures of merit need to be determined during the validation process. Random and systematic errors are measured in terms of precision and bias. The detection limit is established for each analyte. The accuracy and precision are determined at the concentration range where the method is to be used. The linear dynamic range is established and the calibration sensitivity is measured. In general, method validation provides a comprehensive picture of the merits of a new method and provides a basis for comparison with existing methods.

A typical validation process involves one or more of the following steps:

- Determination of the single-operator figures of merit. Accuracy, precision, detection limits, linear dynamic range, and sensitivity are determined. Analysis is performed at different concentrations using standards.
- Analysis of unknown samples. This step involves the analysis of samples whose concentrations are unknown. Both qualitative and quantitative measurements should be performed. Reliable unknown samples are obtained from commercial sources or governmental agencies as certified reference materials. The accuracy and precision are determined.
- Equivalency testing. Once the method has been developed, it is compared to similar existing methods. Statistical tests are used to determine whether the new and established methods give equivalent results. Typical tests include Student's t-test for a comparison of the means and the F-test for a comparison of variances.
- Collaborative testing. Once the method has been validated in one laboratory, it may be subjected to collaborative testing. Here, identical test samples and operating procedures are distributed to several laboratories. The results are analyzed statistically to determine bias and interlaboratory variability. This step determines the ruggedness of the method.

Method validation depends on the type and purpose of analysis. For example, the recommended validation procedure for PCR, followed by capillary gel electrophoresis of recombinant DNA, may consist of the following steps:

1. Compare precision by analyzing multiple (say, six) independent replicates of reference standards under identical conditions.
2. Data should be analyzed with a coefficient of variation less than a specified value (say, 10%).
3. Validation should be performed on three separate days to compare precision by analyzing three replicates of reference standards under identical conditions (once again, the acceptance criterion should be a prespecified coefficient of variation).
4. To demonstrate that other analysts can perform the experiment with similar precision, two separate analysts should make three independent measurements (the acceptance criterion is once again a prespecified RSD).
5. The limit of detection, limit of quantitation, and linear dynamic range are to be determined by serial dilution of a sample. Three replicate measurements at each level are recommended, and the acceptance criterion for calibration linearity should be a prespecified correlation coefficient (say, an r2 value of 0.995 or greater).
6. The molecular weight markers should fall within established migration time ranges for the analysis to be acceptable. If the markers are outside this range, the gel electrophoresis run must be repeated.

1.4. PRESERVATION OF SAMPLES

The sample must be representative of the object under investigation. Physical, chemical, and biological processes may be involved in changing the composition of a sample after it is collected. Physical processes that may degrade a sample are volatilization, diffusion, and adsorption on surfaces. Possible chemical changes include photochemical reactions, oxidation, and precipitation. Biological processes include biodegradation and enzymatic reactions. Once again, sample degradation becomes more of an issue at low analyte concentrations and in trace analysis.

The sample collected is exposed to conditions different from those of the original source. For example, analytes in a groundwater sample that have never been exposed to light can undergo significant photochemical reactions when exposed to sunlight. It is not possible to preserve the integrity of any sample indefinitely. Techniques should aim at preserving the sample at least until the analysis is completed. A practical approach is to run tests to see how long a sample can be held without degradation and then to complete the analysis within that time. Table 1.3 lists some typical preservation methods. These methods keep the sample stable and do not interfere in the analysis. Common steps in sample preservation are the use of proper containers, temperature control, the addition of preservatives, and the observance of recommended sample holding times. The holding time depends on the analyte of interest and the sample matrix. For example, most dissolved metals are