Automating the Indicator Calibration Process
Eliminate human error when calibrating precision hand tools by leveraging modern vision systems.
All gaging equipment must be calibrated periodically to ensure it can measure parts accurately. This applies to every hand tool or gage used in a manufacturing environment to verify the quality of the parts produced. Calibration has always been necessary for maintaining quality, but there are also external reasons to establish and maintain a regular gage calibration program, chiefly customer requirements: it is now common for companies to require their suppliers to document quality efforts from start to finish.
Some large companies with thousands of hand measuring tools, dial/digital indicators and comparators can justify the cost of hiring or training specialists in gage calibration methods and supplying them with equipment to perform in-house calibration. However, calibrating dial and digital indicators or comparators can be a very time-consuming and operator-intensive process.
Example of points required for checking an indicator.
Most dial indicators are relatively short-range, but they need to be checked at multiple points throughout their range to verify accuracy, then checked again in the reverse direction to verify hysteresis requirements. Historically, most dial indicator calibrators have been built around a high-precision mechanical micrometer: the micrometer is turned to a known point and any deviation is observed on the dial indicator. Even for a short-range indicator, the process involves moving a mechanical dial calibrator by hand to 20 or more points along the indicator’s travel. This is not too difficult for a short-range indicator, but with a longer-range indicator, say 12.5, 25, 50 or even 100 mm of range, there are a lot of positions to reach and points to observe and record.
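As a rough illustration, the short sketch below tallies the positions an operator must set by hand for a forward pass plus the reverse (hysteresis) pass. The number of points per direction and the even spacing are assumptions for illustration, not a calibration standard.

```python
# Illustrative sketch: generate the up/down check points for one indicator
# calibration run. 20 evenly spaced points per direction is an assumption,
# not a requirement from any standard.

def calibration_points(travel_mm: float, points_per_direction: int = 20):
    """Return target positions (mm) for a forward pass followed by a
    reverse pass, so hysteresis can be evaluated at the same positions."""
    step = travel_mm / points_per_direction
    forward = [round(i * step, 4) for i in range(points_per_direction + 1)]
    reverse = list(reversed(forward))
    return forward, reverse

fwd, rev = calibration_points(12.5)   # e.g., a 12.5 mm travel indicator
print(len(fwd) + len(rev), "positions to set and read by hand")
```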
This also takes a significant amount of time and concentration from the user. Doing it for many indicators throughout the day is stressful for the operator, not only from hand-positioning the micrometer head to hundreds if not thousands of points, but also from the eye strain of reading the micrometer head and the indicator. Reading is also problematic, since people will naturally (and unintentionally) transpose numbers or simply misread. In addition, with a dial indicator, not viewing the dial straight on causes a parallax error and a misreading of the result.
To reduce operator stress and increase productivity, automated calibrators are available that drive a precision spindle to the desired location for the indicator being checked. The operator can then read and record the deviations. These machines significantly reduce the hand/arm strain caused by constantly rotating the micrometer head. This is a significant improvement. However, there are better options.
The real improvement is to eliminate the operator from the loop: install the indicator into a calibration tool, set the parameters within the gage for the indicator, and then let the gage measure and certify the indicator without operator involvement. This allows the gage technician to be productive preparing the next indicator to be checked, signing indicator certifications, or even starting another calibration process while the automated calibrator is working.
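A minimal sketch of such an unattended cycle follows: drive the spindle to each check point, let the vision system read the indicator, and log the deviation. The Spindle and VisionReader classes are simulated stand-ins for illustration, not a real instrument API, and the point list and tolerance are assumed values.

```python
# Simulated sketch of an unattended indicator calibration cycle.
# The classes below are illustrative placeholders, not a vendor API.

class Spindle:
    """Simulated precision spindle that sets the known reference position."""
    def __init__(self):
        self.position_mm = 0.0
    def move_to(self, target_mm: float):
        self.position_mm = target_mm

class VisionReader:
    """Simulated vision reading of the indicator at the current position."""
    def __init__(self, spindle: Spindle):
        self.spindle = spindle
    def read(self) -> float:
        return self.spindle.position_mm + 0.001   # pretend 1 µm deviation

def calibrate(spindle, reader, positions_mm, tolerance_mm):
    results = []
    for target in positions_mm:
        spindle.move_to(target)          # set the known reference point
        indicated = reader.read()        # vision system reads the indicator
        results.append((target, indicated, indicated - target))
    worst = max(abs(dev) for _, _, dev in results)
    return results, worst <= tolerance_mm

spindle = Spindle()
reader = VisionReader(spindle)
_, passed = calibrate(spindle, reader, [0.0, 2.5, 5.0, 7.5, 10.0, 12.5], 0.005)
print("indicator passes:", passed)
```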
Systems can “read” the indicator to capture its values.
With today’s modern vision systems, it is possible to “read” the dial/digital indicator or comparator. By read, I mean the vision system knows what the indicator and its dial should look like, processes an image to locate the pointer relative to the graduations, and interpolates that position into a measurement. In the case of digital indicators, the digital display is scanned by the system's camera, the digits are analyzed/“read” by the controller, and the actual deviation between measurements is calculated.
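To show what that interpolation amounts to, here is a minimal sketch that converts a detected pointer angle into a dial reading. It assumes the vision system has already located the pointer and reported its angle; the dial parameters (100 graduations of 0.01 mm per revolution) are example values, not a fixed rule.

```python
# Minimal sketch: convert a detected pointer angle into a dial reading.
# The dial layout (100 graduations, 0.01 mm per graduation) is an example.

def dial_reading_mm(pointer_angle_deg: float,
                    grads_per_rev: int = 100,
                    mm_per_grad: float = 0.01) -> float:
    """Interpolate the pointer position around the dial and convert it
    to a length value, as described above."""
    fraction_of_rev = (pointer_angle_deg % 360.0) / 360.0
    return fraction_of_rev * grads_per_rev * mm_per_grad

print(dial_reading_mm(90.0))   # a quarter turn reads 0.25 mm on this dial
```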
Because of this automation with image processing, what was once a labor-intensive process with a high risk of error is now faster. It also reduces measurement uncertainty while preventing potential stress and injury to the operator. With the auto-recognition of the vision system, more test items with more data points can be recorded faster than with conventional, manual methods. This frees the operator to be productive during the automated measuring process.