Calibration, One Step at a Time
A gage that meets its performance specs helps ensure better measurement results.
Calibration is a process used to ensure that a measurement device meets its performance requirements. The device, a gage block, for example, is brought into an environmentally controlled room and measured against traceable standards by a trained operator using specialized equipment and a defined measuring process. All potential errors relating to the process are identified and their potential effects quantified.
Once the gage block has been measured, it is determined to be either within specification or not. A certificate is issued stating that the block is within defined limits, if that is the case, and providing the actual recorded measurements. A block found to be out of spec is usually replaced or downgraded to a lower grade in which it does meet the calibration specs. There is not much that can be done to bring an out-of-spec gage block back into spec.
Most other types of gages are different from gage blocks in this respect. If the gage does not meet specification after the calibration process is performed, there are usually ways to adjust the performance of (calibrate) the gage so that it does meet the specification. This is true for handheld gages like calipers and micrometers, as well as dial indicators, bench amplifier/probe combinations, and surface-finish or form-measuring systems.
Regardless of the type of gage, all calibration processes have certain steps in common, whether you send your gages out or calibrate them in house. Let’s take a look at a typical calibration process for a vernier, dial or digital caliper. All three of these instruments typically have the capability of measuring outer diameter (OD), inner diameter (ID) and depth with the use of an integrated depth rod. The calibration process involves comparing all these readings against known standards. In addition, ID and OD jaw parallelism also needs to be verified.
Gage blocks are often used for calibration, but specifically designed masters also are available that make the process of checking calipers a little more efficient. And while not a hard-and-fast requirement, the inspection usually involves making measurement checks at fixed points (at every 1-inch increment, for example) or at 25, 50, 75 and 100 percent of the instrument's range. For dial calipers, it may be wise to select a couple of blocks that fall within the range of one revolution of the dial to check its short-range performance.
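The two check-point schemes above are simple arithmetic. Here is a minimal sketch of both, assuming a hypothetical 6-inch caliper; adjust the range to suit the instrument being calibrated:

```python
# Sketch: generate check points for a caliper inspection plan.
# RANGE_IN is an assumed 6-inch caliper range, not a fixed rule.

RANGE_IN = 6.0  # full measuring range of the caliper, in inches

# Option 1: fixed checks at every 1-inch increment across the range
inch_steps = [float(i) for i in range(1, int(RANGE_IN) + 1)]

# Option 2: checks at 25, 50, 75 and 100 percent of the range
percent_steps = [RANGE_IN * p for p in (0.25, 0.50, 0.75, 1.00)]

print(inch_steps)     # -> [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
print(percent_steps)  # -> [1.5, 3.0, 4.5, 6.0]
```

Either plan works; the percent-of-range plan scales naturally to calipers of different capacities.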
Prior to starting the calibration process, gather the right tools for the job: disposable wipes and alcohol; gage blocks or a caliper test master (and the documentation that they have been recently certified); master rings; a master pin; a surface plate; and calibration stickers. Of course, all of these will be used in a room environmentally stabilized for temperature, vibration and humidity.
Once the masters and caliper have settled to ambient temperature after being cleaned with the wipes and alcohol, the inspection process can begin. First, check repeatability to zero by bringing the jaws together and zeroing the instrument. Next, open and close the jaws multiple times to repeat the zero reading. Significant non-repeatability is a sign of either dirt or worn guideways, which should be addressed before going any further.
Next, check the parallelism of the OD jaws with the precision pin. Place the pin between the jaws at the top and zero the gage, then move the pin to the center and tip of the jaw and compare the results. Consult the manufacturer’s specifications, but there should be virtually no difference in readings. Though a little more difficult, the same test can be performed on the ID jaws using the master ring.
Now the calibration process for the caliper can begin. Close the jaws and zero the caliper. Using the gage blocks or a test piece, measure each incremental step, open and close to check repeat, and record the values. Compare the results to the manufacturer’s specifications, then repeat the process with the ID jaws. Finally, check the depth rod with a 1-inch gage block set on the surface plate by zeroing on the plate and then measuring the depth. Again, consult the manufacturer’s specifications for acceptable variation.
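The pass/fail decision at each step is just a comparison of the recorded reading against the nominal block size within a tolerance. A brief sketch, using illustrative readings and an assumed ±0.001-inch tolerance (always substitute the manufacturer's actual specification):

```python
# Sketch: flag caliper readings that deviate from nominal gage-block
# sizes by more than a tolerance. The tolerance and readings here are
# illustrative assumptions, not manufacturer values.

TOLERANCE_IN = 0.001  # assumed acceptance limit, in inches

# nominal gage-block size -> reading taken from the caliper
readings = {1.0: 1.0005, 2.0: 1.9995, 3.0: 3.0020, 4.0: 4.0002}

results = {}
for nominal, measured in readings.items():
    deviation = measured - nominal
    results[nominal] = "PASS" if abs(deviation) <= TOLERANCE_IN else "FAIL"
    print(f"{nominal:.3f} in: reading {measured:.4f}, "
          f"deviation {deviation:+.4f} -> {results[nominal]}")
```

Recording the deviations, not just the pass/fail verdicts, makes it easier to spot a caliper that is drifting toward its limits from one calibration to the next.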
Hidden within all these steps are the usual sources of error that should be considered:
• Variations in the measuring force applied to the masters (too much force can distort the gage).
• Masters that have not recently been certified.
• Temperature of the masters and caliper.
• Jaws that are not parallel, are worn, have burrs or are dirty.
• Caliper beams that are bent or warped.
• A loose hand on a dial caliper and loose gibs (some calipers have adjustments for tightening up the fit between the fixed and the sliding jaw).
• Temperature changes caused by handling.
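The temperature items in the list above can be quantified with the standard linear thermal-expansion relation, dL = L × alpha × dT. A short sketch, using a typical handbook coefficient for steel (an assumption; use the actual material's value):

```python
# Sketch: estimate the length error introduced by temperature, using
# the linear thermal-expansion relation dL = L * alpha * dT.
# ALPHA_STEEL is a typical handbook value for gage-block steel.

ALPHA_STEEL = 11.5e-6  # per degree C, assumed coefficient for steel

def thermal_growth_mm(length_mm, delta_t_c, alpha=ALPHA_STEEL):
    """Length change in mm for a temperature offset from 20 C."""
    return length_mm * alpha * delta_t_c

# A 100 mm master warmed 2 C by handling grows about 2.3 micrometers,
# which is significant at gage-block tolerances:
growth = thermal_growth_mm(100.0, 2.0)
print(f"{growth * 1000:.2f} micrometers")
```

This is why the masters and the caliper should soak to ambient temperature before inspection begins, and why handling should be minimized.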
If a caliper does not meet the manufacturer’s specs and you have addressed all these sources of error, there is little left to do to bring the caliper into spec. It may be time to invest in a replacement.
Caliper calibration is about as basic as it gets, but almost every one of the errors identified in this process can be found one way or the other when calibrating dial indicators, micrometers, gage blocks or complicated measuring systems as well. Start small, identify these basic processes and, with practice, gage calibration can become a routine for ensuring better measurement results.