Looking through some older electronic and air gaging catalogs, I noticed that the term "magnification" appeared frequently. In today's world of digital indicators and amplifiers, this term is often left out of the description, simply because digital electronics work a little differently than older, analog amplifiers. Today, the term magnification is more apt to be used with optical comparators or vision systems because they do what the word implies. The trouble is, many people make the wrong implication.
Optical comparators use a form of magnifying glass, a device whose origin goes back nearly 2,000 years. Magnifying is really the process of making an object appear bigger without changing its physical size.
The magnifying glass was first used in the quality world to enhance the ability of human eyes to see certain aspects of an object. Through the use of these lenses, early users were able to discern features that could not be seen otherwise. But while early optical instruments could make objects appear bigger, they could not actually perform measurements. To measure, you must first have a standard against which to compare the object.
The first use of magnification to measure with optics was the shadowgraph. In these instruments, projecting light over an object created a shadow. The magnified image was superimposed on a ruler that acted as the reference standard. The lenses were made and positioned to magnify the image to a certain multiple of the original size, for example 5×. Optical gaging advanced another step when micrometers or indicating scales were added to the positioning devices on the gage. With these, actual measurements of the parts could be made using the micrometer scale as the measurement standard.
In all of these examples, the use of magnification is simply to make the object being observed appear larger.
But magnification is also used in dial indicators and in any electronic amplifier that has an analog meter as part of the readout. In these cases, the readout hand (the dial indicator hand or amplifier needle) moves more than the actual displacement of the sensing member. In a dial indicator, gears act like levers to magnify the movement seen at the indicator contact point. With an electronic amplifier, circuitry amplifies the input signal from the sensor. Amplification is much like magnification, except that the amplified electrical signal is actually made larger, as opposed to simply appearing larger. But the end result is the same: the amplified signal makes a meter hand move to represent the magnified displacement.
Air gages have also used the term magnify to express their measuring capabilities. Often they will refer to a magnification of 2,500, 5,000 or even 10,000. But what this magnification refers to is actually how much more the meter hand moves than the sensing end does. In truth, we're talking about signal amplification, but the end result is to make the movement appear bigger.
But what does the phrase "2,500 times magnification" actually mean? While not all amplifier systems are quite the same, the underlying idea is the same: magnification is the ratio of readout movement to the movement of the sensing member.
In the simplest example, if the actual scale of the electronic amplifier, or air gage, is 7.5 inches long, and the total range of the measuring instrument is 0.003 inch, then the magnification is 2,500 (7.5 ÷ 0.003 = 2,500). If we had the same length of indicating scale, but the measurement range of the sensor was 0.0015 inch, then the magnification of the system would be 5,000. You can also use this formula to figure out the length of the scale on your amplifier. For example, if the manufacturer of your display states that the magnification is 16,000 and the measuring range is 0.0005 inch, then the length of the scale is 16,000 × 0.0005, and your scale is 8 inches long.
Why is any of this important? Because users are too often fooled into thinking that higher magnification means better or more accurate measurement. But it does not. Why? Because any error inherent in your measurement is going to get magnified just as much. That is what gaging uncertainty is all about. But that's a topic for another column.
The point to remember is that whether you are actually viewing something that appears bigger, as on an optical comparator, or watching the needle of an amplifier move, you are seeing a magnified result, not necessarily a more accurate one.