Calibration is the activity of checking, by comparison with a standard, the accuracy of a measuring instrument of any type. It may also include adjustment of the instrument to bring it into alignment with the standard. Even the most precise measuring instrument is of no use if you cannot be sure that it is reading accurately or, more realistically, if you do not know what its measurement error is. Let’s begin with a few definitions:
- Calibration range – the region between the limits within which a quantity is measured, received, or transmitted, expressed by stating the lower and upper range values.
- Zero value – the lower end of the calibration range
- Span – the difference between the upper and lower range values
- Instrument range – the capability of the instrument; may be different than the calibration range.
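To make these definitions concrete, here is a minimal sketch with hypothetical numbers (a transmitter capable of 0–400 psig but calibrated over 0–300 psig); none of these values come from the text above.

```python
# Hypothetical pressure transmitter (illustrative values only):
# the hardware can measure 0-400 psig, but it is calibrated over 0-300 psig.
instrument_range = (0.0, 400.0)    # psig, capability of the instrument
calibration_range = (0.0, 300.0)   # psig, range it has been calibrated for

zero_value = calibration_range[0]                    # lower end of the calibration range
span = calibration_range[1] - calibration_range[0]   # upper range value minus lower range value

print(f"Zero value: {zero_value} psig")   # 0.0 psig
print(f"Span: {span} psig")               # 300.0 psig
```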
Be careful not to confuse the range the instrument is capable of with the range for which it has been calibrated.

Ideally, a product would produce test results that exactly match the sample value, with no error at any point within the calibrated range; this line is labeled “Ideal Results.” Without calibration, however, an actual product may produce test results that differ from the sample value, with a potentially large error. Calibrating the product can improve this situation significantly. During calibration, the product is “taught,” using the known values of Calibrators 1 and 2, what result it should provide. The process eliminates the error at these two points, in effect moving the “Before Calibration” curve closer to the “Ideal Results” line, as shown by the “After Calibration” curve. The error has been reduced to zero at the calibration points, and the residual error at any other point within the operating range is within the manufacturer’s published linearity or accuracy specification. (A simple two-point correction of this kind is sketched after the definitions below.)

Every calibration should be performed to a specified tolerance. The terms tolerance and accuracy are often used incorrectly. In ISA’s The Automation, Systems, and Instrumentation Dictionary, the definitions for each are as follows:
- Accuracy – the ratio of the error to the full scale output or the ratio of the error to the output, expressed in percent span or percent reading, respectively.
- Tolerance – permissible deviation from a specified value; may be expressed in measurement units, percent of span, or percent of reading.
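One way to picture the two-point “teaching” described above is a straight-line correction fitted through the two calibrator values. The sketch below is an illustration under assumed numbers, not a prescribed procedure; the function name, raw readings, and 0–100 range are invented for the example.

```python
def two_point_correction(raw, raw_at_cal1, raw_at_cal2, cal1, cal2):
    """Linear correction 'taught' by two known calibrator values (cal1, cal2).

    raw_at_cal1 and raw_at_cal2 are what the uncalibrated instrument read
    when Calibrator 1 and Calibrator 2 were applied.
    """
    slope = (cal2 - cal1) / (raw_at_cal2 - raw_at_cal1)
    return cal1 + slope * (raw - raw_at_cal1)

# Illustrative numbers: the instrument read 2.0 and 98.0 when the true
# (calibrator) values were 0.0 and 100.0.
corrected = two_point_correction(98.0, 2.0, 98.0, 0.0, 100.0)

# Error expressed as percent of span (per the accuracy definition above),
# assuming a 0-100 calibration range; at a calibration point it is zero.
span = 100.0 - 0.0
error_pct_span = 100.0 * (corrected - 100.0) / span
print(f"Corrected reading: {corrected:.2f} -> error: {error_pct_span:+.2f}% of span")
```

Only the two taught points are corrected exactly; everywhere else the residual error depends on how linear the instrument really is.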
The calibration tolerance should be determined from a combination of factors, including:
- Requirements of the process
- Capability of available test equipment
- Consistency with similar instruments at your facility
- Manufacturer’s specified tolerance
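As a hypothetical illustration of how a tolerance stated in percent of span translates into measurement units (the 0–300 psig range and ±0.25% figure are invented, not taken from the text):

```python
# Hypothetical: calibration range 0-300 psig, tolerance specified as ±0.25% of span.
lower, upper = 0.0, 300.0
tolerance_pct_span = 0.25

span = upper - lower
tolerance_units = span * tolerance_pct_span / 100.0
print(f"Tolerance window at each test point: ±{tolerance_units} psig")  # ±0.75 psig
```

Expressed in measurement units like this, the tolerance becomes a simple pass/fail window to compare against at each test point.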