The Rule of Thumb, Part 1

Guidelines used to standardize the measuring process can provide a good basis for making gage decisions.

Column from Modern Machine Shop, 3/17/2014

By George Schuetz

The rule is to select the minimum graduation value marked on the dial indicator that is closest to 10 percent of the tolerance spread of the work you are measuring.

I did some figuring the other day and estimate, conservatively, that we have probably answered at least 50,000 gaging questions over the past 35 years or so. Some of these questions have been challenging. They have pushed me to learn more about my business and our industry, and to grow professionally. Many questions have to do with various “rules of thumb” that have been floating around the industry for ages.

Over the years, much has happened to standardize the measurement process and make it more reliable and repeatable. International, national and industrial organizations have put procedures in place to help ensure that the process—and its verification—is correct and followed every time.

Interestingly, most of these measuring processes are based on old rules that were studied and improved upon, with their concepts eventually incorporated into one or more of today’s standards. The thing to remember is that, while these old “rules” may form the basis of today’s practice, they are not necessarily best practice in themselves. But when a shop-floor part-quality disaster is unfolding, referencing some of these basics may save the day.

The granddaddy of them all is the 10-to-1 rule, which most likely came out of early manufacturing, turned into a military standard and then evolved into some of the standards used today. Its logic still makes sense when making the basic setup decisions that put good gages together.

For example, in gages with analog or digital readouts, the rule says the measuring instrument should resolve to approximately one-tenth of the tolerance being measured. This means that if the total tolerance spread is 0.0002 inch, the smallest increment displayed on the gage should be 20 microinches. A gage that reads only to 50 microinches can’t resolve closely enough for accurate judgments in borderline cases and doesn’t allow for the observation of trends within the tolerance band. On the other hand, a digital gage that resolves to 5 microinches might give the user the impression of excessive part variation as digits fly by on the display. That said, 10:1 is not readily achievable on some extremely tight-tolerance applications—say, ±50 microinches or less—and it may be necessary to accept 5:1. But for coarser work, 10:1 or something very close to it is always a good recommendation.
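To make the arithmetic concrete, here is a minimal sketch, in Python, of how the 10:1 resolution rule might be applied when choosing an indicator. The function name and the list of candidate graduations are illustrative assumptions, not something from the column; values are in inches.

```python
# Illustrative only: the candidate graduations and helper below are assumptions.
STANDARD_GRADUATIONS = [0.001, 0.0005, 0.0001, 0.00005, 0.00002, 0.00001, 0.000005]

def recommended_graduation(total_tolerance_spread, target_ratio=10.0):
    """Pick the coarsest standard graduation that still resolves to roughly
    one-tenth of the total tolerance spread."""
    target = total_tolerance_spread / target_ratio
    candidates = [g for g in STANDARD_GRADUATIONS if g <= target]
    # If nothing meets 10:1 (extremely tight work), fall back to the finest
    # graduation available; in practice that may mean accepting 5:1.
    return max(candidates) if candidates else min(STANDARD_GRADUATIONS)

# Example from the text: a 0.0002-inch spread points to a 20-microinch graduation.
print(recommended_graduation(0.0002))  # -> 2e-05
```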

Without a doubt, the most common question I am asked has to do with selecting a gage: “I’ve got a bushing with a 0.750-inch bore that has to hold ±0.001 inch. What kind of gage should I use?” There are a number of choices: a dial bore gage, an inside micrometer, an air plug, a self-centralizing electronic plug or any one of several other gages. But picking the right gage for an application comes down to three basic things: the tolerance you are working with, the volume of components you are producing and the degree of flexibility you require in the gaging system.

For gage performance, we go back to our 10-to-1 rule: If your tolerance is ±0.001 inch, you need a gage with a performance rating of at least 10 times that, or within one-tenth (±0.0001 inch). A gage repeatability and reproducibility (GR&R) study is one way of determining gage performance. GR&R studies are designed to show how repeatable the specified accuracy is when the gage is used by a number of operators measuring a number of parts in the manufacturing environment. There is no single standard for GR&R studies, but generally, it is a statistical approach to quantifying gage performance under real-life conditions. Often this is expressed as the ability to measure within a certain range a certain percentage of the time. The gage you pick should pass your own in-house GR&R requirements.
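As a rough illustration of the statistics involved, the sketch below takes a simple variance-components view: repeatability estimated from the scatter of each operator’s repeat readings, reproducibility from the differences between operator averages, and the combined six-sigma spread expressed as a percentage of the tolerance. The sample data, function name and structure are assumptions; formal GR&R studies typically follow a published procedure such as the AIAG average-and-range or ANOVA methods.

```python
import statistics

# measurements[operator][part] -> repeat readings in inches (made-up sample data)
measurements = {
    "op_a": {"part1": [0.7501, 0.7502, 0.7501], "part2": [0.7508, 0.7507, 0.7508]},
    "op_b": {"part1": [0.7502, 0.7503, 0.7502], "part2": [0.7509, 0.7508, 0.7509]},
}

def percent_grr(measurements, total_tolerance):
    # Repeatability: pooled within-operator, within-part variance.
    within_vars = [statistics.variance(readings)
                   for parts in measurements.values()
                   for readings in parts.values()]
    repeatability_var = statistics.mean(within_vars)

    # Reproducibility: variance of each operator's overall average.
    operator_means = [statistics.mean([r for readings in parts.values() for r in readings])
                      for parts in measurements.values()]
    reproducibility_var = statistics.variance(operator_means)

    grr_sigma = (repeatability_var + reproducibility_var) ** 0.5
    # Express the six-sigma gage spread as a percentage of the total tolerance.
    return 100.0 * 6.0 * grr_sigma / total_tolerance

# For a ±0.001-inch part (0.002-inch total tolerance):
print(f"%GRR of tolerance: {percent_grr(measurements, 0.002):.1f}")
```

By one common convention, results under roughly 10 percent of tolerance are considered acceptable and results over 30 percent are not, but the key point stands: the gage should pass whatever in-house requirement you set.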

Typically, the rule of thumb for selecting a master has been to choose one whose tolerance is 10 percent of the part tolerance. This, combined with the gage’s performance, should provide adequate assurance of a good measurement process. It’s usually not worthwhile to buy more accuracy than this 10-to-1 rule; it costs more, it doesn’t improve the accuracy, and the master will lose calibration faster. On the other hand, when manufacturing to extremely tight tolerances, you might have to use a ratio of 4:1 or even 3:1 between gage and standard, simply because the master cannot be manufactured and inspected using a 10:1 rule.
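The arithmetic behind the master rule is simple enough to express the same way. This is a minimal sketch, assuming a straight division of the part tolerance by the chosen ratio; the function name and example values are illustrative.

```python
def master_tolerance(part_tolerance_spread, ratio=10.0):
    """Suggested total tolerance for the setting master, given the part
    tolerance spread and the chosen gage-to-standard ratio."""
    return part_tolerance_spread / ratio

# A ±0.001-inch part (0.002-inch spread) suggests a master held to about 0.0002 inch.
print(master_tolerance(0.002))            # -> 0.0002 (10:1)
# For very tight work, a relaxed ratio may be all that can be made and certified.
print(master_tolerance(0.0001, ratio=4))  # -> 2.5e-05 (4:1)
```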

These are examples of where a rule of thumb may be the basis for a gage decision. It is not necessarily the final decision, but provides a way of working towards the best choice based on accepted test processes. There are many other rules relating to surface finish checking and gage design, and we’ll look at some of them next month.
