Gage design requires certain physical characteristics for reliable performance. A rigid and sound design, for instance, helps ensure that operators have as little influence on the measurement as possible. One of the most important principles is alignment.
This principle states that measurement is most accurate when the line or axis of the measurement coincides with the line of the scale or other dimensional reference. In the real world, it is rarely possible to design a gage in which the scale and axis of measurement coincide. But they should be as close as possible and definitely in the same plane.
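The cost of separating the two lines can be put in rough numbers. As a sketch (not from the article), assume a small angular error between the gage's moving member and its scale: to first order, the measurement error is the offset between the line of measurement and the line of the scale, multiplied by the tangent of that angle. The function name and figures below are illustrative only.

```python
import math

def offset_error(offset_in, misalignment_rad):
    """First-order alignment error: the measured length differs from
    the true length by roughly the offset between the measurement axis
    and the scale axis, times the tangent of the angular error."""
    return offset_in * math.tan(misalignment_rad)

# Illustrative numbers: a 2 in. offset between the two lines, combined
# with an angular error of just 0.0004 radian, produces a real error.
err = offset_error(2.0, 0.0004)
print(f"{err:.4f} in.")  # 0.0008 in.
```

The formula makes the design lesson concrete: shrinking either the offset or the angular error shrinks the measurement error in direct proportion.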
Probably the simplest way to visualize this is with a caliper—vernier, dial, or digital. All rely on certain alignments of the jaws to ensure correct measurement. While the caliper may not be one of the most accurate tools in the metrology toolbox, it typifies the kinds of errors that can be found in more accurate gages, such as a horizontal measuring machine that must perform to microinches.
Imagine a caliper in perfect condition, with a straight beam, finely machined and straight surfaces, flat jaws square to the beam, and a perfect scale. The line of measurement is offset from the line of the scale, but it lies in the same plane, and for any given separation of the jaws, the scale reading will correspond to that separation. Now imagine that the reference scale were mounted to the beam of the caliper incorrectly, so that it was not square to the jaws. This is unrealistic, but it shows how taking the scale out of planar alignment distorts the accuracy of measurement.
A more realistic scenario would add an exaggerated curvature to the beam of the caliper. With the beam bowed this way, the distance between the tips of the jaws is less than the distance indicated by the scale reading. However, as we move the contact points of the measurement up the jaws, closer and closer to the scale, the reading becomes more reliable, because the line of measurement moves closer to the reference.
If we put on our geometry caps, we can work out the actual errors being generated in this example. Let’s say the curvature of the beam is 0.001 inch over a 10-inch beam length. A useful rule of thumb helps here: when the height of an arc is small in proportion to the length of its chord—which is always the case in examples like this—the apex of the triangle formed by the tangents at the ends of the arc is twice the height of the arc.
Because we said the arc height is 0.001 inch, the apex of the formed triangle is 0.002 inch high. We have simple right triangles to work with: the tangent of the angle between each tangent line and the chord is 0.002 inch divided by the 5-inch half-chord, or 0.0004—about 0.023 degree, or roughly 1.4 minutes of arc. If we assume the length of the jaws is 2 inches, each jaw tip is displaced by 2 inches times 0.0004, and with both jaws tilted, the difference between the jaw-tip separation and the reading indicated on the caliper scale is about 0.0016 inch.
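The arithmetic above can be packaged into a short sketch. This follows the rule-of-thumb geometry just described—the tangent-chord angle of a shallow arc, with both jaws tilting by that angle—and the function name is my own:

```python
import math

def caliper_curvature_error(sagitta_in, beam_in, jaw_in):
    """Estimate the jaw-tip error of a caliper with a bowed beam.
    Per the rule of thumb, the apex of the tangent triangle is twice
    the arc height (sagitta), so the tangent-chord angle at each end
    is (2 * sagitta) / (beam / 2). Each jaw of length jaw_in tilts
    by that angle, and both jaws contribute to the error."""
    theta = math.atan((2 * sagitta_in) / (beam_in / 2))  # radians
    error = 2 * jaw_in * math.tan(theta)
    return theta, error

# The article's example: 0.001 in. bow over a 10 in. beam, 2 in. jaws.
theta, err = caliper_curvature_error(0.001, 10.0, 2.0)
print(f"tilt angle:    {math.degrees(theta):.3f} deg")  # 0.023 deg
print(f"jaw-tip error: {err:.4f} in.")                  # 0.0016 in.
```

Rerunning the function with the contact point moved close to the beam (say, jaw_in of 0.25 inch) shrinks the error to 0.0002 inch, which is the point of the previous paragraph: the closer the line of measurement is to the scale, the smaller the error.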
Most vernier calipers do not have the resolution to show an error this small, but dial and digital units may be able to read it. What is important is that this kind of misalignment generates real errors. Now think about the same condition in a laboratory universal measuring machine or a precision jig bore. The distances could be much greater, and so could the errors.
In hand tools and measuring machines, every effort should be made in the design to make sure all measuring surfaces are aligned. But how do you, the user, tell if your gage is aligned well? With simple gages, you probably can’t, other than to look at them with the principle of alignment in mind. With measuring machines, looking at the specifications for straightness of the ways and squareness will give an indication of how closely the designers aligned the components.