Reduced Errors
Preparing multiple calibration standards for any instrumental method of analysis requires measuring a volume of a solution of known concentration and adding solvent to reach the concentration of each calibration standard. Every calibration standard must be prepared this way, and every volume measurement, whether of the known solution or of the solvent, is an opportunity for error. Errors made while preparing the calibration standards propagate directly into the results of the analysis. For trace analysis, where only a very small volume of the known solution must be measured for each standard, the relative error in that measurement can grow by more than an order of magnitude.
Serial dilution requires measuring a volume of the solution of known concentration only once. Each successive calibration standard is derived from the previous one, so the absolute error carried into each standard shrinks along with the concentration of that standard.
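The shrinking absolute error can be sketched numerically. The 0.5% per-transfer volumetric uncertainty and the quadrature combination of successive transfers below are illustrative assumptions, not values from the text:

```python
import math

def serial_dilution_errors(c0, factor, steps, rel_unc_per_transfer=0.005):
    """Estimate the uncertainty of each standard in a serial dilution.

    Assumed model: each transfer carries a fixed relative volumetric
    uncertainty, and the relative uncertainties of successive transfers
    combine in quadrature.
    """
    results = []
    conc = c0
    for n in range(1, steps + 1):
        conc /= factor
        rel_unc = rel_unc_per_transfer * math.sqrt(n)  # quadrature sum of n equal terms
        results.append((conc, conc * rel_unc))  # (concentration, absolute uncertainty)
    return results

# Stock at 100 (arbitrary units), four successive 1:10 dilutions
for conc, abs_unc in serial_dilution_errors(100.0, 10.0, 4):
    print(f"{conc:10.4f} +/- {abs_unc:.5f}")
```

Even though the relative uncertainty grows slowly with each step, the concentration drops by the dilution factor each time, so the absolute uncertainty of each standard is smaller than that of the one before it.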
Easier and Faster Preparation of Calibration Standards
Each calibration standard solution is prepared from the previous one: a portion of the previous standard is taken and diluted with solvent to obtain the next standard in the series. The errors introduced with each successive dilution drop in proportion to the solution concentration. Because the same transfer-and-dilute step is simply repeated, preparing a series of calibration standards this way takes less time, and since most calibrations span a large range of concentrations, the accuracy of the low-concentration standards improves as well.
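The repeated transfer-and-dilute step can be turned into a simple pipetting recipe. The function below is a sketch under an assumed convention (every standard is made up to the same final volume); the names and the 10 mL default are illustrative:

```python
def dilution_recipe(stock_conc, factor, steps, final_volume_ml=10.0):
    """Work out the pipetting recipe for one serial-dilution series.

    Assumed convention: each standard is made up to the same final volume,
    so each step transfers final_volume/factor of the previous standard
    and tops up with solvent.
    """
    transfer = final_volume_ml / factor
    solvent = final_volume_ml - transfer
    recipe = []
    conc = stock_conc
    for _ in range(steps):
        conc /= factor
        recipe.append({
            "concentration": conc,
            "transfer_ml": transfer,   # volume of the previous standard to pipette
            "solvent_ml": solvent,     # volume of solvent to add
        })
    return recipe

# Stock at 50 (arbitrary units), three 1:5 dilutions into 10 mL flasks
for step in dilution_recipe(50.0, 5.0, 3):
    print(step)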
Calibration Solutions More Evenly Spaced
The calibration standards should span the entire concentration range of the analysis, and the more evenly the standards are spaced over this range, the more reliable the results of the analysis. Standards that are evenly spaced on a logarithmic concentration axis are easy to prepare by serial dilution: each successive standard takes a small portion of the previous one, which is diluted with solvent to generate the next calibration standard in the series.
Greater Flexibility in Calibration Range
Any dilution factor chosen for the series of calibration standards is achievable by serial dilution. The progression of calibration standard concentrations is always a geometric series: if the first standard is made at 1/3 the concentration of the known solution, the next calibrant is 1/9 the concentration of the known, and the following two are 1/27 and 1/81. This becomes a much greater advantage when the calibration standards must span several orders of magnitude in concentration.
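The worked example above can be reproduced exactly with rational arithmetic, which makes the geometric progression explicit:

```python
from fractions import Fraction

# Repeated 1:3 dilutions give standards at 1/3, 1/9, 1/27, and 1/81
# of the stock concentration, i.e. a geometric series with ratio 1/3.
factor = Fraction(1, 3)
series = [factor**n for n in range(1, 5)]

for frac in series:
    print(frac)
```

Four steps already cover nearly two orders of magnitude; adding one more standard extends the range by another factor of 3 with a single additional transfer.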