As with any precise measurement device, calibration of a load cell, or of the scale that uses it, is an exact science. Before delving into the subject of calibration, let's take a closer look at the intricacies of the device, to get a better idea of exactly what we will be calibrating. A load cell is a type of transducer (meaning that it converts one type of energy into a different type of energy) that senses an applied force or weight and converts it into an electrical signal.
This does not happen by way of a direct conversion, but rather occurs in two separate stages. First, the detected force acts on one or more strain gauges, deforming them, which causes their electrical resistance to change. This relative change in resistance is then used to measure the strain. While load cells containing one or two strain gauges do exist, the most common devices contain four strain gauges, typically wired as a Wheatstone bridge. The electrical output of the bridge is then fed to circuitry that computes and displays a digital value representing the force or weight originally applied to the load cell.
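The two-stage conversion above can be sketched numerically. The snippet below models a four-gauge Wheatstone bridge; the gauge resistance, resistance change, and excitation voltage are illustrative values chosen so the output lands at 2 mV/V, not parameters of any specific load cell.

```python
def bridge_output_v(excitation_v, r1, r2, r3, r4):
    """Differential output (volts) of a Wheatstone bridge with arms r1..r4 (ohms).

    r1/r2 form one voltage divider and r3/r4 the other; the output is the
    difference between the two divider midpoints.
    """
    return excitation_v * (r2 / (r1 + r2) - r4 / (r3 + r4))

R = 350.0          # nominal gauge resistance, ohms (a common value)
dR = 0.7           # resistance change under full load (illustrative)
excitation = 10.0  # bridge excitation voltage, volts

# In a full bridge, two gauges increase and two decrease in resistance
# under load, so the small changes add rather than cancel.
v_out = bridge_output_v(excitation, R - dR, R + dR, R + dR, R - dR)
print(v_out)  # 0.02 V = 20 mV at 10 V excitation, i.e. 2 mV/V
```

With this arrangement the output reduces to `excitation * dR / R`, which is why load-cell sensitivity is quoted as a ratio (mV per volt of excitation) rather than as an absolute voltage.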
After taking a more in-depth look at how load cells provide measurements, it is easy to see how the calibration of these devices can be a delicate science. In addition, any load cell in official use today must be calibrated to NIST (National Institute of Standards and Technology) standards. NIST, founded in 1901, has worked to provide accurate and effective guidelines for the calibration of countless types of measuring systems. Given that history, one would be hard-pressed to find a more expert source. Their guidance for the calibration of a load cell is as follows:
- At full-scale load, the output is typically 2 mV/V or 3 mV/V, i.e., one part in 500 of the excitation or one part in 333 of the excitation. Therefore, for high-precision calibrations, a ratio measurement is made. We use an 8-1/2 digit digital voltmeter in ratio mode, which has an error limit specification for this measurement of 35 ppm for a year from calibration. This DVM also communicates with the PC by the IEEE-488 bus.
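The ratio measurement described above means the DVM reports output divided by excitation directly, so drift in the excitation voltage cancels out. A minimal sketch of turning such a reading into a force value, assuming an illustrative 2 mV/V sensitivity and a hypothetical 100 kN capacity:

```python
def force_from_ratio(ratio_mv_per_v, sensitivity_mv_per_v, capacity_n):
    """Convert a bridge ratio reading (mV/V) to applied force (newtons),
    assuming a linear load cell with the given full-scale sensitivity."""
    return capacity_n * ratio_mv_per_v / sensitivity_mv_per_v

capacity = 100_000.0  # rated capacity in newtons (illustrative)
sensitivity = 2.0     # full-scale output in mV/V, i.e. one part in 500
reading = 1.0         # DVM ratio reading in mV/V

print(force_from_ratio(reading, sensitivity, capacity))  # 50000.0 N, half load
```

Real calibrations fit a polynomial to many load points rather than assuming linearity; the single-ratio version here just shows why the mV/V ratio, not the raw voltage, is the quantity of interest.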
It is also noted in the NIST requirements that these force calibrations are issued with a stated 20 ppm uncertainty for the force values, with an additional 50 ppm or 33 ppm uncertainty when the readout instrumentation is considered. It is also mentioned that the accuracy of these measurement ratios can be improved significantly by more frequent calibration. To reduce the measurement uncertainty by a factor of ten, a single-point ratio calibration at the full scale of the range used (100 mV), or a 100:1 ratio, can be used. During NIST tests, a self-calibrating 100:1 ratio divider was used, resulting in an uncertainty of only 0.5 ppm for this calibration.
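To see how the readout contribution dominates these figures, the components can be combined in quadrature (root-sum-square), the usual treatment for independent uncertainty sources; this is a sketch of that arithmetic, not NIST's stated procedure.

```python
import math

def combined_ppm(*components_ppm):
    """Root-sum-square combination of independent uncertainty components (ppm)."""
    return math.sqrt(sum(c * c for c in components_ppm))

# 20 ppm force value combined with a 33 ppm readout contribution:
print(round(combined_ppm(20.0, 33.0), 1))  # 38.6 ppm

# With the self-calibrated 100:1 ratio divider (0.5 ppm readout),
# the total collapses to essentially the force uncertainty alone:
print(round(combined_ppm(20.0, 0.5), 1))   # 20.0 ppm
```

The comparison makes the article's point concrete: improving the readout calibration by a factor of ten shifts the overall uncertainty from readout-dominated to force-standard-dominated.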
As noted earlier, the calibration of such delicate measuring systems can be quite a challenge, but with persistence and the required work, frequent calibration reliably improves the accuracy and effectiveness of load cells and any other measurement systems in use.