As companies look to implement computer-aided process planning along with other manufacturing information systems, they will do well to scrutinize the needs of all system users.
One of the most important steps in converting a design concept into a manufactured product is process planning. The essence of that task is the creation of a complete package of information on how to perform the manufacturing process, which may include work instructions for the shop floor, a bill of material, a quality control plan, tool planning, and so on. Also, there may be links to other manufacturing systems such as MRP (material requirements planning), PDM (product data management), time standards, engineering and manufacturing change control, shopfloor control and data collection systems. In most cases, this initial package of information ultimately determines the final cost and quality of the product.
Traditionally, manufacturing engineers produced the necessary process planning documents from scratch using manual techniques. That required the retrieval and manipulation of a great deal of information from many sources, including established standards, machinability data, machine capabilities, tooling inventories, stock availability and, ideally, existing practice. The resulting process plan then took the form of printed text, lists and drawings.
The introduction of computers into manufacturing has certainly made the planning function more efficient, but there are additional advantages. For one, computers can readily perform vast numbers of comparisons and, therefore, many more alternative plans can be explored than would be practical in a manual setup. Also, the application of computers can bring greater uniformity to process planning. Ask ten engineers to develop a process plan for the same part, and you will probably end up with ten different plans. Not only does this mean some plans will be better than others, but also that essentially similar jobs planned at different times will be done differently. However, with the comparative capabilities brought about by computer-aided process planning (CAPP), it becomes easier to answer the questions: Which plan best utilizes the facility's capabilities? Which can be used for estimating future work? Which is best for scheduling and shop loading? And most important, which plan reflects the best practice based on past experience?
While CAPP can indeed answer these questions, to be of optimum value, particularly in larger manufacturing facilities, companies must carefully consider its implementation and integration with other systems. Here are some factors to think about if CAPP is to achieve its potential.
CAPP got its start with Group Technology (GT), which was touted as a solution to manufacturing in an environment of smaller lots and shorter product life cycles. The underlying principle of GT is relatively simple: Use a well-structured coding and classification system to identify similar components and processes. Then once "families of parts" are identified, they can be manufactured with standardized process plans.
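The coding-and-grouping idea behind GT can be sketched in a few lines. This is a deliberately toy scheme (real classification systems such as Opitz use long multi-digit codes); the part attributes and code digits here are illustrative assumptions, not a real standard.

```python
from collections import defaultdict

# Hypothetical simplified GT coding: one digit each for shape class,
# size band and material. Real schemes are far more detailed.
def gt_code(part):
    shape = {"rotational": "1", "prismatic": "2"}[part["shape"]]
    size = "1" if part["max_dim_mm"] <= 50 else "2" if part["max_dim_mm"] <= 200 else "3"
    material = {"steel": "1", "aluminum": "2"}[part["material"]]
    return shape + size + material

parts = [
    {"id": "P-001", "shape": "rotational", "max_dim_mm": 40, "material": "steel"},
    {"id": "P-002", "shape": "rotational", "max_dim_mm": 45, "material": "steel"},
    {"id": "P-003", "shape": "prismatic", "max_dim_mm": 120, "material": "aluminum"},
]

# Parts that share a code form a family and can share a process plan.
families = defaultdict(list)
for p in parts:
    families[gt_code(p)].append(p["id"])

print(dict(families))  # → {'111': ['P-001', 'P-002'], '222': ['P-003']}
```

Once families exist, a standardized plan is written once per family rather than once per part.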
Early CAPP systems were based on this general principle, and still are, though there are now basically two approaches to how systems work--variant and generative. In the variant approach, a set of standard process plans is established for all the parts families identified through GT. Then when a new plan is required, an applicable standard plan is retrieved and edited to suit the specific requirements of the new part.
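The variant workflow, reduced to its essentials, is a lookup followed by a copy-and-edit step. The plan library, family code and operation list below are invented for illustration; the point is only that the standard plan is retrieved, copied and modified, never altered in place.

```python
import copy

# Hypothetical library of standard plans keyed by GT family code.
standard_plans = {
    "111": ["saw blank", "turn OD", "drill center hole", "deburr", "inspect"],
}

def variant_plan(family_code, edits):
    """Retrieve the family's standard plan and apply part-specific edits.

    edits is a list of (operation_index, replacement_operation) pairs.
    """
    plan = copy.deepcopy(standard_plans[family_code])  # never mutate the master
    for index, operation in edits:
        plan[index] = operation
    return plan

# A new part belongs to family "111" but needs a reamed rather than drilled hole.
plan = variant_plan("111", edits=[(2, "drill and ream center hole")])
print(plan)
```

The copy step matters: the standard plan stays intact as the family's reference, while each part gets its own edited variant.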
In the generative approach, an attempt is made to synthesize a process plan for each individual part using algorithms that capture the various technological decisions that must be made in the course of manufacturing. In a truly generative system, the sequence of operations as well as all the process parameters would be established automatically, without reference to prior plans. The costs of setting up such a system are so high, however, that so-called generative process planning systems have been developed only for specific operations--selection of feeds and speeds, for example--or for uniform families of similar parts.
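Feed and speed selection, the narrow domain the text cites, gives a feel for what a rule-based generative module looks like. The rule table and numbers below are placeholders, not machining data, and the halved-feed finishing rule is an invented example of a technological decision encoded as a rule.

```python
# Illustrative decision rules: (material, operation) -> (speed, feed).
# Values are placeholders for demonstration, not real machining data.
RULES = [
    (("steel", "turning"), (150, 0.25)),
    (("steel", "drilling"), (25, 0.15)),
    (("aluminum", "turning"), (400, 0.30)),
]

def select_parameters(material, operation, tight_tolerance=False):
    """Pick cutting parameters by rule rather than by copying a prior plan."""
    for (m, op), (speed, feed) in RULES:
        if m == material and op == operation:
            if tight_tolerance:      # example finishing rule: halve the feed
                feed *= 0.5
            return {"speed_m_min": speed, "feed_mm_rev": feed}
    raise ValueError(f"no rule for {material}/{operation}")

print(select_parameters("steel", "turning", tight_tolerance=True))
# → {'speed_m_min': 150, 'feed_mm_rev': 0.125}
```

Scaling this from one decision to a whole plan — operation sequencing, tooling, fixturing — is what drives the setup cost the article describes.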
Although some early CAPP systems contained elaborate classification and retrieval capabilities, coding all parts in a typical manufacturing environment proved to be unrealistic. It was simply too tedious, time-consuming and expensive. And in time, many of these systems were used primarily as word processors with some retrieval of standard texts. Nevertheless, this was a great improvement over the old ways of paper-driven process planning. It simply lacked the sophisticated retrieval and modification capabilities of a modern CAPP system.
Also critical to the capabilities of today's CAPP is the ability to integrate with other data-management systems. A host of computer-based technologies--such as CAD, CAM, CNC, CAPP and statistical QC--all have the potential to optimize the manufacturing process, but only if they are used to reduce redundancy and to learn from past mistakes.
Early attempts to execute this integration were carried out on mainframe computers, and with a great deal of custom software, often developed by the users themselves. The trend now, however, is toward distributed processing in a network environment and "COTS" (commercial off-the-shelf) software. Still, every company wants its own bells and whistles and various print routines or screen displays geared to the presentation requirements of its departments. Therefore, to be effective, a so-called COTS CAPP system must have the capability of being tailored to specific interface and output requirements. One way of achieving such flexibility is to incorporate a macro language that enables the installer to fine-tune the system without having to alter the underlying software code.
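The principle — the installer edits a substitution template, not the system's source code — can be shown with Python's `string.Template` standing in for a vendor macro language. The traveler layout and field names here are invented for illustration.

```python
from string import Template

# Hypothetical shop traveler layout. A site installer could change this
# template (add fields, reorder lines) without touching the CAPP code
# that fills it in -- the flexibility the macro-language approach buys.
TRAVELER = Template(
    "WORK ORDER $order  PART $part\n"
    "OP $op: $instruction\n"
)

print(TRAVELER.substitute(order="WO-1042", part="P-001",
                          op="20", instruction="Turn OD to 38.10 mm"))
```

Swapping in a different template changes every printed traveler, with no change to the underlying software.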
One of the pitfalls of applying new computer-based technologies is that they make it easier to come up with new designs and methods but not necessarily those that are the most efficient for a particular manufacturing environment. Some may argue that starting from scratch with a clean design and a fresh manufacturing plan will more likely ensure everything is done correctly. However, starting from scratch can result in part designs that are functionally interchangeable but with widely varying manufacturing costs due to differences in tolerances and materials.
In fact, it has become increasingly apparent that when a new product is designed, roughly 80 percent of its parts are either the same or similar to parts already developed for other products. To reinvent these parts from scratch is a waste of effort no matter how fast and easily it can be done. Moreover, this approach increases the likelihood of mistakes sneaking through to the later stages of production because the new design or plan never went through the rigors of the manufacturing process. For these reasons, effective retrieval of existing manufacturing experience becomes essential.
GT often requires an analysis of 2,000 to 3,000 parts before the number of distinct codes needed to represent all part families levels off. However, part families really represent combinations of basic part features such as holes, slots or pockets, and the number of features found at any one company is usually quite limited. Furthermore, process plans are determined by part features, not part families.
In fact, most company divisions deal with somewhere between 20 and 30 different features, which usually can be identified by analyzing between 100 and 200 randomly selected parts. Each feature, in turn, has a small number of manufacturing methods associated with it--about 15 on average, depending on feature size, material and tolerance. This means that most companies deal with no more than 500 distinct manufacturing methods, a very manageable number for retrieval purposes.
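A feature-indexed method library of that size fits comfortably in a simple lookup. The features, methods and tolerance limits below are assumptions chosen for illustration; the point is the retrieval pattern: index by feature, then filter by what the part actually requires.

```python
# Hypothetical feature-to-method index. The article's argument is that a
# few hundred (feature, method) pairs cover most of a plant's work.
METHODS = {
    "hole": [
        {"method": "drill",        "min_tolerance_mm": 0.10},
        {"method": "drill + ream", "min_tolerance_mm": 0.02},
        {"method": "drill + bore", "min_tolerance_mm": 0.01},
    ],
    "slot": [
        {"method": "end mill",     "min_tolerance_mm": 0.05},
    ],
}

def candidate_methods(feature, tolerance_mm):
    """Return methods capable of holding the required tolerance."""
    return [m["method"] for m in METHODS[feature]
            if m["min_tolerance_mm"] <= tolerance_mm]

# A hole toleranced at ±0.03 mm rules out plain drilling.
print(candidate_methods("hole", 0.03))  # → ['drill + ream', 'drill + bore']
```

Retrieval by feature, rather than by whole-part family, is what keeps the search space to a few hundred entries instead of thousands of coded parts.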
What about artificial intelligence? At the end of the 1980s you could barely sell a CAPP system without offering a rule-based generative capability. But actual use of this capability has been minimal. At a conference of some 60 U.S. and European companies sponsored by Houtzeel Manufacturing Systems in 1991, the application of AI in CAPP was explored. Several participants had invested substantial efforts in the development of AI-based generative process planning systems. But all had come to the conclusion that such systems, though technically feasible, provide unacceptably low returns on investment, particularly in dealing with general detail parts or assembly operations.
However, where groups of similar parts or assembly operations are concerned, such as the manufacture of turbine blades, they determined an AI-based system could be profitable. In other words, a global application of AI to provide generative process planning for the entire universe of parts manufactured by a company is unrealistic. However, using GT analysis, it is possible to arrive at a limited number of part families where an expert system can be cost effective.
It becomes increasingly important to implement a manufacturing infrastructure compatible with today's installed computer networks of mainframes, workstations and PCs to provide access to the many heterogeneous systems. Those systems may serve such diverse activities as CAD, CAM, MRP, SPC, tool management and work time measurement.
To be effective, such a manufacturing information system must be able to create, check and deliver information in a seamless and user-friendly manner, preferably employing a common user interface. The system must be capable of creating and delivering information in a number of ways, whether designs and process plans are created from scratch or with generative or variant approaches.
As manufacturing information is created, it must be made available to all affected parties for concurrent review, and that review process should be properly documented. Moreover, the delivery of information among the various players in the manufacturing cycle should be tailored to their individual needs and expectations. While the manufacturing engineer may compose a complete process planning package including tool, QC, work instructions, and so on, the machine tool operator may require, among other things, a very detailed setup plan that includes graphics, digital photos or video, NC programs and data, or inspection programs and instructions.
Finally, and most important, the system must provide an effective feedback mechanism. It is not sufficient that information flows only from the top down, or that it consists solely of instructions to the next phase in the manufacturing cycle. People at each node in the manufacturing network must be able to annotate the information they receive with appropriate comments reflecting their experience and know-how and pass it back among the affected parties. As an example, red-lining (the electronic attachment and tracking of relevant notes) between manufacturing and design engineering can be used to help ensure that an emerging design is compatible with the best available manufacturing processes.
Or, when sections of the entire process planning package are disseminated to different manufacturing groups, a feedback mechanism should be in place to signal errors or the potential for better practices to the process planner. This way the information system is continuously updated, which makes future retrieval more relevant. Continuous improvement becomes a reality.
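A tracked red-line is, at bottom, a small annotation record attached to a plan section. This is a minimal sketch of such a structure; the class names and fields are illustrative, not a real PDM or CAPP schema.

```python
from dataclasses import dataclass, field
from datetime import date

# Minimal sketch of tracked red-line annotations on a process plan
# section. Field names are hypothetical, chosen for illustration.
@dataclass
class RedLine:
    author: str
    note: str
    created: date
    resolved: bool = False

@dataclass
class PlanSection:
    title: str
    redlines: list = field(default_factory=list)

    def annotate(self, author, note):
        """Attach a comment that travels with this section of the plan."""
        self.redlines.append(RedLine(author, note, date.today()))

    def open_items(self):
        """Unresolved feedback the process planner still needs to act on."""
        return [r for r in self.redlines if not r.resolved]

setup = PlanSection("Op 20 setup")
setup.annotate("operator", "Fixture F-12 interferes with tool change")
print([r.note for r in setup.open_items()])
```

Because each note carries its author and resolution state, the review trail the article calls for is documented rather than lost in margin scribbles.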
In the use of earlier mainframe-based systems, highly trained engineers tended to specialize in particular tasks such as process planning, bills of material, and so on. But today's manufacturing engineers are required to perform all these tasks, and more efficiently. Consequently, it becomes important that the user interface be similar or the same for all these tasks in order to reduce learning time and errors. Some companies are now insisting that all functions within the manufacturing information handling system employ the same user interface for functions as diverse as engineering and manufacturing change orders, configuration control, process planning, bills of material, tool management, time standards, shopfloor control and data collection, shopfloor access and feedback, and quality control.
Considering this wide range of activities, it only makes sense that uniform user interfaces can substantially improve user efficiency and reduce errors.
However, as companies implement PDM and MRP systems, the need to address the requirements of manufacturing information management often gets lost in the shuffle. Management tends to divide company activities into two major functions--design and production--and not three, so the manufacturing engineering function is generally treated as a secondary issue. The reality is that the manufacturing information handling function is far more complex and important to the success of a company than top management realizes.
Management tends to assume that PDM and MRP can provide all necessary manufacturing information, but in fact neither can provide appropriate access for developing and updating entire process planning packages. For example, MRP systems may provide process planning capabilities but they do not typically have the manufacturing information handling capacity or integration with the graphics capabilities now seen in modern CAPP systems.
Or, as another example, PDM databases are developed according to engineering features and therefore are not necessarily relevant to manufacturing considerations. Functionality or part features are the key factors governing the structure of a product-oriented database while the process planner deals with manufacturing elements and processes such as tools and fixtures, turning, milling, hardening, and so on. So when companies suggest that they intend to maintain one overall product database developed from an engineering/design point of view, they overlook manufacturing engineering's need for access to past manufacturing experience that is so essential to the continuous improvement process.
A critical and as yet unresolved issue is who should maintain authority over such a database. At present, PDM advocates insist that they are the appropriate agency to maintain and control the product information database and that any changes or updates must go through them. But unless the PDM system maintains a database in which items can be searched for by engineering and manufacturing features, it is not possible to integrate the manufacturing function--and management's vision of an overall product database becomes an illusion.
At present, the UNIX client/server approach best serves the need for a distributed network because it provides adequate security for the data by strictly controlling access. Different client/server approaches are available. For one, the database can reside on the server and the manufacturing information handling system software (such as CAPP) resides on individual workstations. Or, both the database and system software can reside on a larger server that is accessed with X-Window terminals. And in the near future, the system security of PCs may be sophisticated enough to have system software running on this platform with a database located on either a UNIX or PC server.