3
Tools for Virtual Design and Manufacturing

Five technical domains have been identified in which virtual design and manufacturing tools exist or where important areas of knowledge and practice are supported by information technology: systems engineering, engineering design, materials science, manufacturing, and life-cycle assessment. However, progress is needed in order to more fully take advantage of these models, simulations, databases, and systematic methods. Each of the domains is largely independent of the others, although links are being made, bridges are being built, and practitioners and researchers in each domain recognize the value of knowledge in some of the other domains. Intercommunication and interoperability are two prerequisites for serious progress. Formidable technical and nontechnical barriers exist, and the committee offers recommendations in each domain.

TOOL EVOLUTION AND COMPATIBILITY

Throughout human history tools have evolved, typically driven by technological availability, market dynamics, and fundamental need. In agriculture, teams of oxen have been replaced by sophisticated tractors with specialized attachments. Computing tools have morphed from fingers and toes to abacuses to slide rules to calculators to high-performance computers. The software used within these computing systems has evolved in terms of programming levels of abstraction and overall functionality. Software not only is written as an end item that operates within a product, but now also gets developed as models and simulations to emulate the end item itself in order to perfect its eventual production, field use, and retirement. Software-based tools are developed to create and use these models and simulations to best perform design, engineering analyses, and manufacturing. Table 3-1 lists examples of available tools and the areas in which they operate.

Advanced engineering environments (AEEs) are integrated computational systems and tools that facilitate design and production activities within and across organizations. An AEE may include the following elements:

  • Design tools such as computer-aided design (CAD), computer-aided engineering (CAE), and simulation

  • Production tools such as computer-aided manufacturing (CAM), manufacturing execution system, and workflow simulation


TABLE 3-1 Representative Tools Used in the Industry

Columns (system life cycle, left to right):

  • Activity: Marketing; Product Engineering; Industrial Engineering; Marketing (each activity spans one or more of the phases below)

  • Function (Action): Mission or Customer Needs (Requirements Analysis); Product Planning (Functional Analysis); Product Architecture (Synthesis); Engineering Design (Analysis, Visualization, and Simulation); Manufacturing Engineering (Analysis and Visualization); Manufacturing Operations (Production and Assembly); Field Operations (Use, Support, and Disposal)

Rows (engineering/technical cost analysis, grouped by tool category). For each row, cell contents are listed in life-cycle order from left to right, separated by vertical bars; a row with fewer than seven cells had one or more blank cells in the original table.

Systems Engineering

  • Business Case, Forecasting: @Risk, Crystal Ball, Excel, i2, Innovation Management, JD Edwards, Manugistics, Oracle, PeopleSoft, QFD/Capture, RDD-SD, SAP, Siebel | Arena PLM, Eclipse CRM, Innovation Management, MySAP PLM, RDD-SD, specDEV, TRIZ | Arena PLM, Innovation Management, MySAP PLM, RDD-IDTC, specDEV, TRIZ | Arena PLM, Innovation Management, MySAP PLM, RDD-IDTC, specDEV, TRIZ | Arena PLM, Innovation Management, MySAP PLM, specDEV, TRIZ | Arena PLM, MySAP PLM, specDEV, TRIZ | Innovation Management

  • Product Life-Cycle Planning and Management: Innovation Management, QFD/Capture, RDD-RM | Geac, I-Logix, Innovation Management, Invensys, JD Edwards, Oracle, PeopleSoft, RDD-SA, SAP, Windchill | Innovation Management, RDD-SD | Functional Prototyping, RDD-SD | Functional Prototyping | Innovation Management, RDD-SD

  • Resource Planning: Project, RDD-DVF, RDD-SD, TaskFlow Management | Innovation Management, RDD-DVF, RDD-SD | DSM, Geac, Invensys, JD Edwards, Oracle, PeopleSoft, RDD-SD, SAP, TaskFlow Management | DSM, Project, RDD-SD, TaskFlow Management | TaskFlow Management, HMS-CAPP | TaskFlow Management | TaskFlow Management

Computer-aided engineering

  • Modeling: Caliber, DOORS, RDD-SD, RDD-OM, Innovation Management, Statemate | Caliber, DOORS, Innovation Management, RDD-OM, Statemate | ADAMS, Caliber, DADS, DOORS, Dynasty, EASA, Engineous, Innovation Management, LMS, MatLab, MSC, Opnet, Phoenix, RDD-OM, RDD-SD, Statemate, VL | Abaqus, AML, Ansys, AutoCAD, AVL, Caliber, CATIA, DOORS, EASA, EDS, Engineous, Fluent, Functional Prototyping, IDEAS, MSC, Opnet, Phoenix, ProE, RDD-SD, StarCD, Statemate, Unigraphics, Working Model | Caliber, DFMA, DOORS, Functional Prototyping | Caliber, DOORS | Caliber, DOORS, Innovation Management

  • Simulation: Caliber, DOORS, Innovation Management, RDD-DVF, Statemate | Caliber, DOORS, Innovation Management, RDD-DVF, Statemate, Working Model | Caliber, DOORS, CATIA, Delmia V5, Enovia V5, RDD-SD, Innovation Management, Statemate | Abaqus, AML, ANSoft, Ansys, Caliber, DICTRA, DOORS, DYNA3D, EASA, EDS, Engineous, Functional Prototyping, ICEM CFD, LMS, ModelCenter, MSC, NASTRAN, Phoenix, RDD-SD, Statemate, Stella/Ithink | Caliber, DOORS, Functional Prototyping, HMS-CAPP | Caliber, DOORS | Caliber, DOORS, Innovation Management

  • Visualization: Innovation Management, RDD-OM, Statemate | Innovation Management, RDD-OM, RDD-SD, Statemate | CATIA, Delmia V5, EDS, Enovia V5, Innovation Management, Jack, RDD-SA, Slate, Statemate | Abaqus, ACIS, Amira, Ansys, EDS, EnSight, Fakespace, Functional Prototyping, Ilogix, Jack, MatLab, Open-DX, RDD-SD, Rhino, SABRE, Simulink, Slate, Statemate, VisMockup | Functional Prototyping, Statemate | Innovation Management

Computer-aided manufacturing

  • Product Data Management: Innovation Management | Innovation Management | CATIA, Delmia V5, Enovia V5 | CATIA, Dassault, Delmia V5, EDS, Enovia V5, Metaphase, PTC, Windchill | Innovation Management

  • Electronic Design Automation: Caliber, DOORS | Caliber, DOORS, MatLab | Caliber, DOORS, Integrated Analysis, Simulator, Verilog-XL | Cadence, Caliber, Dassault, DOORS, Integrated Analysis, Mentor Graphics, PTC, System Vision | Caliber, DOORS | Caliber, DOORS, PADS | Caliber, DOORS, Integrated Analysis

  • Manufacturing System Design: Functional Prototyping, RDD-ITDC, RDD-SD | Innovation Management, RDD-ITDC, RDD-SD | Integrated Data Sources, RDD-ITDC, RDD-SD | CimStation, Envision/Igrip, Integrated Data Sources, RDD-ITDC, RDD-SD | CIM Bridge, EDS, Tecnomatix | Functional Prototyping

  • Manufacturing System Modeling: Functional Prototyping, RDD-ITDC, RDD-SD | Functional Prototyping, RDD-ITDC, RDD-SD | DICTRA, Functional Prototyping, Pandat, RDD-ITDC, RDD-SD, Thermo-Calc | Abinitio, CimStation, Dante, DEFORM, Envision/Igrip, Functional Prototyping, MAGMA, ProCast, RDD-ITDC, RDD-SD, SysWeld | Abinitio, Arena, Dante, DEFORM, Extend, Functional Prototyping, MAGMA, Pro/Model, ProCast, Simul8, SysWeld, TaylorED, Witness | Abinitio, Dante, DEFORM, Functional Prototyping, MAGMA, ProCast, SysWeld | Functional Prototyping

  • Manufacturing System Simulation: Functional Prototyping | Caliber, DOORS, Functional Prototyping | Abinitio, Caliber, CimStation, Dante, DEFORM, DOORS, Envision/Igrip, MAGMA, ProCast, SysWeld | Abinitio, Arena, Caliber, Dante, DEFORM, DOORS, Extend, MAGMA, Pro/Model, ProCast, Simul8, SysWeld, TaylorED, Witness | Abinitio, Dante, DEFORM, MAGMA, ProCast, SysWeld

  • Manufacturing System Visualization: Functional Prototyping | Functional Prototyping | Functional Prototyping | CimStation, Envision/Igrip, Functional Prototyping | Arena, Extend, Functional Prototyping, Pro/Model, Simul8, TaylorED, Witness | Abinitio, Functional Prototyping, MAGMA, ProCast, SysWeld | Functional Prototyping

  • Reliability Models: RDD-ITDC, RDD-SD | Functional Prototyping, RDD-ITDC, RDD-SD | DEFORM, DisCom2, Functional Prototyping, RDD-ITDC, RDD-SD | CASRE, Functional Prototyping, RDD-ITDC, RDD-SD | JMP, Minitab, SAS, WinSMITH | RDD-ITDC, RDD-SD

  • Logistics: Eclipse ERP, Integrated Analysis, RDD-ITDC, RDD-SD | Integrated Analysis, RDD-ITDC, RDD-SD | RDD-ITDC, RDD-SD | Integrated Analysis, RDD-ITDC, RDD-SD | Integrated Analysis | JD Edwards, Logistics, Manugistics | Integrated Analysis, RDD-ITDC, RDD-SD

  • Purchasing: Purchasing plus | I2, Invensys, JD Edwards, Oracle, PeopleSoft, PTC, SAP

  • Supervisory Control: QUEST | Invensys, Siemens

  • Machine Control: Virtual NC | Labview, MATLAB, Unigraphics

  • Program management tools such as configuration management, risk management, and cost and schedule control

  • Data repositories storing integrated data sets

  • Communications networks giving participants inside and outside the organization secure access to data

As shown in Table 3-1, most of these tools exist today, but an AEE is more than just a collection of independent tools. Tools must be integrated to provide interoperability and data fusion. Organizational and interorganizational structures must be configured to reward their use, and workforce skills must be enhanced to make effective use of their capabilities.1

The Carnegie Mellon University Software Engineering Institute (SEI) studied the use of AEEs and concluded that they exist within a broad domain, across all aspects of an organization. AEEs provide comprehensive coverage of and substantial benefits to design and manufacturing activities:

  • Office applications such as word processing, spreadsheets, and e-mail are already familiar to nearly everyone.

  • Computer-aided design and integrated solid modeling not only improve the quality of the engineering product but also provide the basis for the exchange of product data between manufacturers, customers, and suppliers.

  • Computer-aided engineering enables prediction of product performance prior to production, providing the opportunity for design optimization, reducing the risk of performance shortfalls, and building customer confidence.

  • Manufacturing execution systems provide agile, real-time production control and enable timely and accurate status reporting to customers.

  • Electronic data interchange provides up-to-date communication of business and technical data among manufacturers, customers, and suppliers.

  • Information security overlays all operations to keep data safe.

Figure 3-1 is a modification of an SEI chart presented to the committee that helps show the widespread and pervasive use of software that bridges many functions and levels throughout the design and manufacturing enterprise. Enterprise viewpoints concentrate on near-term, mid-term, and far-term perspectives in the context of factory floor execution, tactical analysis, and strategic thinking, respectively.

A product's evolution typically is split into many phases to show its various stages, and most tools can be categorized in terms of the temporal nature of their use. In this case, the committee has elected to view a product's life cycle as shown here in seven stages, from mission needs to field operations. Figure 3-1 shows that there is little overlap between manufacturing modeling and simulation tools, or manufacturing process planning, and engineering design tools, reflecting the lack of interoperability between these steps with currently available software.

Many vendors sell tools that are now beginning to offer intriguing solutions for achieving overlap of key functions. Table 3-1 shows representative examples of some of these tools now being used in industry.2 For example, to address CAD-CAE interoperability, process integration and

1

National Research Council, Advanced Engineering Environments: Achieving the Vision, Phase 1, National Academy Press, Washington, D.C., 1999.

2

In addition, Appendix C describes some of the current engineering design tools and Appendix D provides a list of representative vendors of computer-based tools used for design and other functions.


FIGURE 3-1 Overlay of tools that bridge design and manufacturing. Each ellipse within the chart represents a different tool category. Ellipse size connotes the comprehensiveness of the capabilities of those tools within the matrix, and color shading (or lack thereof) highlights the focus of the various tools' strengths in design, manufacturing, business operations, or management. Blue shades indicate a concentration in design, while green trends into manufacturing. Yellow hues show a proclivity toward business operations. Orange indicates the prominence and importance of data management. Ellipses without color denote project management functions. Source: Special permission to reproduce figure from "Advanced Engineering Environments for Small Manufacturing Enterprises," © 2003 by Carnegie Mellon University, is granted by the Software Engineering Institute.

design optimization software tools that bundle discrete tools in order to facilitate multiprocess optimization are being introduced. Examples of such software are Synaps/Epogy, Isight, and Heeds from Red Cedar Technology. While these software packages look attractive in principle, human input is still essential to bridge the gaps between various analytical tools.
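To make the process-integration pattern concrete, the short Python sketch below mimics what such bundling tools automate: two stand-alone analysis routines (a structural check and a manufacturing cost estimate) are wrapped behind a common interface and driven by a simple search over a shared design variable. This is an illustration only, not a depiction of any of the packages named above; the functions, limits, and coefficients are hypothetical placeholders.

    # Minimal sketch of process integration and design optimization:
    # two independent "tools" are wrapped behind one interface and a simple
    # search over a shared design variable picks the best feasible design.
    # All models and numbers are hypothetical placeholders.

    def structural_tool(thickness_mm):
        """Stand-in structural analysis: deflection falls as thickness grows."""
        return 120.0 / thickness_mm            # deflection in mm (toy model)

    def manufacturing_tool(thickness_mm):
        """Stand-in manufacturing estimate: cost rises with material used."""
        return 40.0 + 12.0 * thickness_mm      # unit cost in dollars (toy model)

    def combined_objective(thickness_mm, max_deflection_mm=10.0):
        """Return unit cost, heavily penalized if the deflection limit is violated."""
        penalty = 1e6 if structural_tool(thickness_mm) > max_deflection_mm else 0.0
        return manufacturing_tool(thickness_mm) + penalty

    candidates = [t / 2.0 for t in range(4, 41)]            # 2.0 mm to 20.0 mm
    best = min(candidates, key=combined_objective)
    print(f"best thickness: {best:.1f} mm, unit cost ${manufacturing_tool(best):.2f}, "
          f"deflection {structural_tool(best):.2f} mm")

Commercial frameworks add real optimization algorithms, distributed execution, and data mapping between the native file formats of each tool; as noted above, human judgment still enters in defining the objective, the constraints, and the couplings between disciplines.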

This chapter covers in depth the state of affairs within each of five different tool categories:

  • The section titled "Systems Engineering Tools" explains how philosophies are expanding from narrow discrete-element minimization to design-trade-space optimization strategies and, while many tools exist within their own specialized field, recommends the need for supervisory control and common links between individual routines.

  • "Engineering Design Tools" discusses the current capabilities of engineering design methods and software and their general lack of interoperability. It makes recommendations to improve communication between design and manufacturing software so that engineering models can be exchanged and simulated in multiple

Suggested Citation:"3 Tools for Virtual Design and Manufacturing." National Research Council. 2004. Retooling Manufacturing: Bridging Design, Materials, and Production. Washington, DC: The National Academies Press. doi: 10.17226/11049.

×

environments.

  • "Materials Science Tools" describes how properties of materials limit the design process and recommends improved physical models and property databases to support virtual design and manufacturing.

  • "Manufacturing Tools" portrays advances in software ranging from detailed process planning and simulation models through production and enterprise management systems, focusing explicitly on issues related to the scope and scale of tools to design for X (DfX), where X is a variety of manufacturing parameters. It recommends organizational and algorithmic approaches for addressing obstacles.

  • "Life-Cycle Assessment Tools" measures the total environmental impact of manufacturing systems from the extraction of raw materials to the disposal of products and evaluates product and process design options for reducing environmental impact.

SYSTEMS ENGINEERING TOOLS

As a phrase, a topic, and a discipline, "systems engineering" in the context of industrial manufacturing has evolved over the last several decades so that it now includes more topics and encompasses a far greater portion of the product life cycle than Henry Ford probably could have envisioned in 1914. By 1980, systems engineering thinking in this context was expanding, but it was still essentially limited to the industrial engineering skill of maximizing production to minimize cost by minimizing the time required to perform each individual manufacturing step or assembly action. The underlying assumption was that minimizing the time required for each discrete event would also minimize the total cost to manufacture an item. As such, the concepts were not applicable until production commenced, and even then they were applied only to minimize cost after the product was designed and the manufacturing process or assembly line was defined. Systems engineering and the discrete-event minimization strategy of the early 1900s could not have predicted Henry Ford's departure from a traditional batch assembly philosophy to the assembly line concept. Even though the resultant unit cost was dramatically reduced, the significant increases in time to first article, cost to design, and cost of construction were seen at the time as insurmountable barriers. The assembly line was an unpredictable, revolutionary change from the evolutionary manufacturing improvements associated with discrete-event minimization.

During the last two decades, systems engineering has evolved to include the cost of automated machine tools as alternatives to labor and has developed several very different cost profiles; but the optimizations were still being performed at the simple part or discrete work element level. And the evaluations were being conducted on an essentially static, or already designed and about to be built, factory. While computers had become readily available in the 1980s, there were no fundamental changes in the process of minimizing the discrete events to minimize the total cost. The computers only crunched more numbers. Today's hardware and software are capable of simulating multiple, if not essentially unlimited, factory designs and equipment variations, giving the systems engineer the ability to affect both prior to a factory's construction.

When the full costs of labor, shipping, and work in process are included in the evaluations, the systems engineer can also affect the manufacturing site selection. But the same discrete-element minimization mentality remains. Current thinking and research in systems engineering are beginning to expand the scope from focusing on discrete work elements to analyzing entire operations, lines, factories, or enterprises to optimize the total cost of a given design or set of designs. With the continuously increased speed and lowered cost of computing, this is generally possible. But the task is being performed by brute-force methodology whereby all known permutations and combinations of discrete events are tried and all but the best are excluded.
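A minimal Python sketch of the brute-force methodology described above, with hypothetical operations and cost figures: every permitted combination of discrete process choices is enumerated, each is costed, and all but the best are excluded.

    # Brute-force exploration of discrete manufacturing choices: enumerate
    # every combination, cost each one, keep only the best.
    # The option sets and per-part costs are hypothetical.
    from itertools import product

    options = {
        "forming":   {"stamping": 1.20, "casting": 2.10, "machining": 3.40},   # $/part
        "joining":   {"welding": 0.80, "riveting": 0.95, "adhesive": 0.60},
        "finishing": {"painting": 0.50, "powder_coat": 0.70, "anodizing": 1.10},
    }

    def total_cost(choice):
        """Sum the per-part cost of one combination of discrete choices."""
        return sum(options[step][name] for step, name in choice.items())

    combinations = [dict(zip(options, names))
                    for names in product(*(options[step] for step in options))]
    best = min(combinations, key=total_cost)
    print(best, f"-> ${total_cost(best):.2f} per part")

Even this toy problem has 27 combinations; with realistic numbers of operations, alternatives, and sequencing constraints the enumeration grows explosively, which is why the approach leans so heavily on cheap computing power.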


FIGURE 3-2 Expanded systems engineering phases. Source: B.S. Blanchard and W.F. Fabrycky, Systems Engineering and Analysis, 3rd Edition, © 1998. Reprinted by permission of Pearson Education Inc., Upper Saddle River, N.J.

During the last 10 years, systems engineering has matured to the point that it is not an uncommon degree program in universities. Industry and defense both utilize the discipline, and there is a globally recognized organization that represents the practitioners. The International Council on Systems Engineering (INCOSE) defines the subject as "… an interdisciplinary approach and means to enable the realization of successful systems." Further, INCOSE lists seven functional areas included in systems engineering (operations, performance, testing, manufacturing, cost and scheduling, training and support, and disposal).3

Blanchard and Fabrycky bring many of the systems engineering concepts and phases together in their book as shown in Figure 3-2. Other authors have described systems engineering as having four (Figure 3-3), seven (Figure 3-1), and even eight (Figure 3-4) phases. The important factors to observe from all this are that systems engineering can include everything from determination of the need for a product to its disposal, and that there are significant overlapping phases (notably design and manufacturing) that require interconnections and the sharing of data and information.

Engineering Cost Analysis

The next logical advance is what is referred to as an engineering or technical cost analysis. In its simplest form, it may be no more than a spreadsheet listing the phases of the product concept through product realization cycle on one axis and identifying the many functional areas, costs, or even software tools on the other. This committee elected to settle on seven phases and to portray the traditional flow of effort (time) from left to right as shown in Table 3-1: mission or customer needs; product planning; product architecture; engineering design; manufacturing engineering; manufacturing operations; and field operations.

3

International Council on Systems Engineering. Available at: http://www.incose.org. Accessed April 2004.


FIGURE 3-3 Life-cycle phases collapsed into four. Source: R. Garrett, Naval Surface Warfare Center, "Opportunities in Modeling and Simulation to Enable Dramatic Improvements in Ordnance Design," presented to the Committee on Bridging Design and Manufacturing, National Research Council, Washington, D.C., April 29, 2003.

A significant number of companies are already identifying where there is a need to communicate and work together both within the company divisions and with other companies. In its second-generation form, engineering cost analysis software will approximate the costs associated with each phase of the product development–realization cycle. In its ultimate form, the engineering cost analysis will include and improve upon all of systems engineering's current discrete event optimization functions; but, more importantly, it will extend forward in time to include accurate estimates for various design, material, and process selection options. In some instances, it may also include the determination of the optimum product concept to satisfy the intended customers' needs and cost constraints. Numerous presentations to the committee made the point that increased and improved communications between all phases will significantly reduce the time from concept to first article.

Some examples of time savings that have already been achieved were presented to the committee:4

  • Chrysler, Ford, and GM have reduced the interval from concept approval to production from 5 to 3 years.

  • Electric Boat has been able to cut the time required for submarine development in half—from 14 years to 7 years.

  • Thirty-eight Sikorsky draftsmen took 6 months to develop working drawings of the CH-53E Super Stallion's outside contours. With virtual modeling and simulation, a single engineer accomplished the same task for the RAH-66 Comanche helicopter in 1 month.

  • Fourteen engineers at the Tank and Automotive Research and Development Center designed a low-silhouette tank prototype in 16 months. By traditional methods this would have taken 3 years and 55 engineers.

  • Northrop Grumman's CAD systems provided a first-time, error-free physical mockup of many sections of the B-2 aircraft.

  • The U.S. Navy's modeling and simulation processes for the Virginia-class submarine reduced the standard parts list from ~95,000 items for the earlier Seawolf-class submarine to ~16,000 items.

It is necessary to provide the engineer at the CAD terminal with new and improved software tools that can give guidance regarding the life-cycle costs of each design decision in both preliminary and detailed design. For example, specific data could be made available to the designer regarding the alternative costs of various manufacturing approaches such as

4

M. Lilienthal, "Observations on the Uses of Modeling and Simulation," presented to the Committee on Bridging Design and Manufacturing, National Research Council, Washington, D.C., February 24, 2003.


FIGURE 3-4 Life-cycle phases expanded into the eight indicated at the top of the figure. Source: A. Adlam, U.S. Army, "TACOM Overview," presented to the Committee on Bridging Design and Manufacturing, National Research Council, Washington, D.C., June 25-26, 2003.

automatic tape lay-up, injection molding, or electron beam welding, which could be selected to reduce unit manufacturing costs. In addition, reliability data for such proven components as hydraulic actuators, electrical connectors, and generators could easily be made available through interconnected databases to achieve a first-cut design that was reliable, maintainable, and low cost. This would be a huge step towards giving customers low total life-cycle costs.

It is critical to note, though generally ignored, that the geometrical shape of a part or assembly will determine the manufacturing processes by which it may be made and will, often unintentionally, limit the materials to just those few that are suitable for those processes. This limitation has led to the rule of thumb that the majority of cost reduction opportunities are lost at the time a part is designed. Frequently, multiple design concepts or design/material combinations will satisfy a desired function. For that reason, it is imperative that all design options, along with their associated manufacturing processes and materials, be evaluated prior to committing to a final design strategy.

Again, several presentations to this committee emphasized the importance of improving the assessment of needs, the exploration of the design trade space, and the means to minimize total life-cycle costs. Figure 3-5 shows one author's views on when and where the full cost of a product is locked in. Other authors provided additional guidelines to support the value of up-front design analysis.

Some guidelines presented to the committee5,6 are listed below:

5

J. Hollenbach, "Modeling and Simulation in Aerospace," presented to the Committee on Bridging Design and Manufacturing, National Research Council, Washington, D.C., February 24, 2003.

6

A. Haggerty, "Modeling the Development of Uninhabited Combat Air Vehicles," presented to the Committee on Bridging Design and Manufacturing, National Research Council, Washington, D.C., April 29, 2003.


FIGURE 3-5 Product cost locked in very early in process. Source: M. Lilienthal, Defense Modeling and Simulation Office, "Observations on the Uses of Modeling and Simulation," presented to the Committee on Bridging Design and Manufacturing, National Research Council, Washington, D.C., February 24-25, 2003.

  • Continue the early collaborative exploration of the largest possible trade space across the life cycle, including manufacturing, logistics, time-phased requirements, and technology insertion.

  • Perform assessments based on modeling and simulation early in the development cycle—alternative system designs built, tested and operated in the computer before critical decisions are locked in and manufacturing begins.

  • Wait to develop designs until requirements are understood.

  • Requirements are the key. Balance them early!

  • Once the design is drawn, the cost and weight are set.

  • No amount of analysis can help a bad design get stronger or cheaper.

  • Remember that 80 percent of a product's cost is determined by the number of parts, assembly technique, manufacturing processes, tooling approach, materials, and tolerances.

The linkages between design, manufacturing, and materials, combined with the value of reaching the customer in the least amount of time, support a robust business case for quick development of initial products. This increased effort at the start would come at a higher than optimal initial cost, but with scheduled updates and design changes it would yield improved reliability and cost over time while maintaining service-part commonality. This approach could also lead to the discovery of design and manufacturing strategies corresponding to an immediate need; when the product development cycle is shortened, products can be designed to be more responsive to specific customer requirements.7 Design and manufacturing concepts that factor in the initial cost to design and manufacture as well as the cost to maintain production and service parts far into the future could be discovered as well.

In its early forms, engineering cost analysis will be forced to simplify the details of most steps of the process, from concept to realization, by assuming generalized time and cost models. While this may seem crude, a manufacturing engineer frequently can make a relatively accurate estimate of a part cost based on its general shape, size, and function, just as a product engineer can provide a similarly accurate estimate of the cost to design a product (or number of parts) based on its complexity, size, and intended use. In this early form, the engineering cost analysis is unlikely to provide an accurate final cost, but it is expected to accurately rank the various options examined. As computing power continues to increase and its cost to fall in line with Moore's law, and as individual program data input/output structures are modified to complement each other, more refinements and accuracy will be obtained and more options may be explored. However, today's state of the art in cost analyses is still inadequate. Development of refined cost analysis tools is vitally needed in the aerospace industry, where "dollars per pound" of airframe is still used for many calculations.
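The ranking point can be illustrated with a deliberately coarse Python sketch: each design option is scored with rough size- and complexity-based models (all coefficients and option data are hypothetical), and although the absolute dollar figures are crude, the ordering of the options is the output an early engineering cost analysis is expected to get right.

    # Coarse early-stage cost models: absolute accuracy is not the goal;
    # a trustworthy ranking of design options is.  All numbers are hypothetical.

    def design_cost(part_count, complexity):
        """Rough nonrecurring design cost from part count and a 1-5 complexity score."""
        return 4_000 * part_count * complexity

    def manufacturing_cost(mass_kg, dollars_per_kg, units):
        """Rough recurring cost from a dollars-per-kilogram rule of thumb."""
        return mass_kg * dollars_per_kg * units

    options = [
        # name,                 parts, complexity, mass (kg), $/kg, production units
        ("sheet-metal assembly",   14,          2,      22.0,   30,          5_000),
        ("composite layup",         5,          4,      15.0,   90,          5_000),
        ("machined billet",         1,          3,      16.0,  140,          5_000),
    ]

    scored = sorted(
        (design_cost(p, c) + manufacturing_cost(m, r, u), name)
        for name, p, c, m, r, u in options
    )
    for total, name in scored:
        print(f"{name:22s} ~${total:,.0f}")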

Whether engineering cost analysis is only another in a series of evolutionary improvements in the industrial systems engineering business process or truly a step function improvement, there will still be significant unknowns and effort required to bring it to fruition and realize its value. Unifying the description of parts (two-dimensional vs. three-dimensional, or solid vs. surface), characterizing manufacturing process effects, designing and developing materials and verifying their properties, creating complementary interprogram data structures, and developing virtual visualization tools will not be easy and will require significant research. However, these developments still only require appropriate funding, time, and discipline to complete. In order to take advantage of these developments, it will be necessary to change the design and manufacturing business culture, so that it focuses on the total life-cycle cost rather than the cost of discrete events. Without this change, designers and manufacturing engineers will remain within their disciplines and continue to suboptimize their portion at the expense of the whole.

Manufacturing Cost Modeling

During conceptual design and concurrent with all other design-for-X (DfX) activities, the life-cycle cost of a product should be addressed. Twenty years ago engineers involved in the design of products may not have concerned themselves with the cost-effectiveness of their design decisions; that was someone else's job. Today the world is different. All engineers in the design process for a product are also tasked with understanding the economic trade-offs associated with their decisions. At issue are not just the manufacturing costs but also the costs associated with the product's life cycle.

Several different types of cost-estimating approaches are potentially applicable at the conceptual design level where engineering decisions about the technology and material content of a product are made.

Traditional material cost analysis uses parametric methods to determine the quantity of a material required in a product. The model then applies a cost policy that reflects how the manufacturer quotes work, manages inventory, and purchases commodity materials. To determine the total manufacturing cost, material costs are combined with labor costs using traditional cost accounting methods, and overhead is applied. Variations in how material, labor, and overhead costs are computed and combined abound and are summarized in the following paragraphs.

7

An example might be a switch from a desert to an arctic conflict or simply between armed conflicts.

Activity-based costing8 (ABC) focuses on accurate allocation of overhead costs to individual products. Other methods include function or parametric costing,9 in which costs are interpolated from historical data for a similar system. Similarly, empirical or cost-scaling methods10 are parametric models based on a feature set. Parametric models are applicable to evolutionary products for which similar products have previously been constructed and high-quality, large-quantity historical data exist.
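A minimal sketch of the parametric idea, with hypothetical historical data and a single cost driver: fit a simple relationship between a feature (here, mass) and realized cost, then interpolate for a new but similar design. A production parametric model would use several calibrated cost drivers.

    # Parametric (function) costing: interpolate the cost of a new design
    # from historical data on similar systems.  Data points are hypothetical.

    historical = [        # (mass in kg, realized unit cost in dollars)
        (10.0, 2_300.0),
        (18.0, 3_900.0),
        (25.0, 5_200.0),
        (40.0, 8_400.0),
    ]

    def fit_linear(points):
        """Ordinary least-squares fit of cost = a + b * mass."""
        n = len(points)
        sx = sum(x for x, _ in points)
        sy = sum(y for _, y in points)
        sxx = sum(x * x for x, _ in points)
        sxy = sum(x * y for x, y in points)
        b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
        a = (sy - b * sx) / n
        return a, b

    a, b = fit_linear(historical)
    new_mass = 30.0
    print(f"estimated unit cost for a {new_mass:.0f} kg design: ${a + b * new_mass:,.0f}")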

Sequential process flow models11 attempt to emulate the actual manufacturing process by modeling each step in sequence and are particularly useful when testing, reworking, and scrapping occur at one or more places in the process. Resource-based cost modeling12 assesses the resources of materials, energy, capital, time, and information associated with the manufacture of a product and aims to enable optimum process selection. Resource-based modeling is similar to the specific process step models embedded within sequential process flow models. Each process step model sums up the resources associated with the step—labor, materials, tooling, equipment—to form a cost for the step that is accumulated with other steps in sequence. Resource-based modeling is the same as sequential step modeling except that use of a specific sequence is not necessarily required.
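The sketch below illustrates the sequential process-flow idea with hypothetical steps, costs, and yields: cost is accumulated step by step, and parts scrapped at a later step waste everything invested in them up to that point, which inflates the cost charged to each good unit.

    # Sequential process-flow cost model (hypothetical steps and numbers):
    # walk the steps in order, accumulate cost, and charge scrap losses at
    # each step to the good units that survive the whole flow.

    steps = [
        # name,           cost added per part ($), step yield (fraction good)
        ("blanking",          1.10, 0.99),
        ("forming",           2.40, 0.97),
        ("heat treat",        0.90, 0.995),
        ("machining",         6.50, 0.95),
        ("inspect and test",  1.20, 0.98),
    ]

    def cost_per_good_unit(steps):
        cumulative_cost = 0.0   # cost invested in a part that has survived so far
        scrap_losses = 0.0      # expected cost thrown away in scrapped parts, per start
        survival = 1.0          # probability that a start reaches the current step
        for _, cost, step_yield in steps:
            cumulative_cost += cost
            scrap_losses += survival * (1.0 - step_yield) * cumulative_cost
            survival *= step_yield
        # each good unit carries its own cost plus its share of all scrap losses
        return cumulative_cost + scrap_losses / survival

    print(f"cost per good unit: ${cost_per_good_unit(steps):.2f}")

A resource-based model would build each step's cost from its labor, material, tooling, and equipment content instead of the single figure per step used here.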

Technical cost modeling carries cost modeling one step further by introducing physical models associated with particular processes into the cost models of the actual production activities. Technical cost modeling13 also incorporates production rate information.

With the advent of products such as integrated circuits, whose manufacturing costs have smaller materials and labor components and greater facilities and equipment (capital) components, new methods for computational cost modeling have appeared such as cost of ownership (COO).14 Cost of ownership modeling is fundamentally different from sequential process flow cost modeling. In a COO approach, the sequence of process steps is of secondary interest; the primary interest is determining what proportion of the lifetime cost of a piece of equipment (or facility) can be attributed to the production of a single piece part. Lifetime cost includes initial purchase and installation costs as well as equipment reliability, utilization, and defects introduced in products that the equipment affects. Accumulating all the fractional lifetime costs of all the equipment for a product gives an estimate of the cost of a single unit of the product. Labor, materials, and tooling in COO are included within the lifetime cost of particular equipment.
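A minimal sketch of the COO arithmetic (all inputs hypothetical): each machine's lifetime cost is spread over the good parts it is expected to process during its life, and the per-part shares of all the equipment in the line are summed. Following the description above, labor, materials, and tooling are assumed to be folded into each machine's lifetime cost.

    # Cost-of-ownership sketch (hypothetical inputs): attribute a share of each
    # machine's lifetime cost to every good part it processes, then sum the
    # shares across the equipment set.

    equipment = [
        # name,    lifetime cost ($), life (h), utilization, throughput (parts/h), yield
        ("placer",        900_000,     40_000,        0.75,               1_800,   0.995),
        ("reflow oven",   350_000,     40_000,        0.80,               1_800,   0.999),
        ("tester",        600_000,     30_000,        0.65,               1_200,   0.970),
    ]

    def per_part_share(lifetime_cost, life_hours, utilization, parts_per_hour, step_yield):
        """Lifetime cost divided by the good parts produced over the equipment's life."""
        good_parts = life_hours * utilization * parts_per_hour * step_yield
        return lifetime_cost / good_parts

    unit_cost = sum(per_part_share(*row[1:]) for row in equipment)
    print(f"equipment-attributed cost per good unit: ${unit_cost:.4f}")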

As one might expect, there are pros and cons associated with all the approaches outlined above. Also, nearly all of the basic manufacturing cost models are supplemented by yield models, learning curve models, and test/rework economic models. Many commercial vendors exist for manufacturing cost modeling tools. Some examples are listed here:

8

P.B.B. Turney, "How Activity-Based Costing Helps Reduce Cost," Journal of Cost Management for the Manufacturing Industry, Vol. 4, No. 4, pp. 29-35, 1991.

9

A.J. Allen and K.G. Swift, "Manufacturing Process Selection and Costing," Proceedings of the Institute of Mechanical Engineers Part B—Journal of Engineering Manufacture, Vol. 204, No. 2, pp. 143-148, 1990.

10

G. Boothroyd, P. Dewhurst, and W.A. Knight, Product Design for Manufacture and Assembly, Marcel Dekker, Inc., 1994.

11

C. Bloch and R. Ranganathan, "Process Based Cost Modeling," IEEE Transactions on Components, Hybrids, and Manufacturing Technologies, pp. 288-294, June 1992.

12

A.M.K. Esawi and M.F. Ashby, "Cost Ranking for Manufacturing Process Selection," Proceedings of the 2nd International Conference on Integrated Design and Manufacturing in Mechanical Engineering, Compiègne, France, 1998.

13

T. Trichy, P. Sandborn, R. Raghavan, and S. Sahasrabudhe, "A New Test/Diagnosis/Rework Model for Use in Technical Cost Modeling of Electronic Systems Assembly," Proceedings of the International Test Conference, pp. 1108-1117, November 2001.

14

R.L. LaFrance and S.B. Westrate, "Cost of Ownership: The Suppliers View," Solid State Technology, pp. 33-37, July 1993.


  • Cognition—process flow cost modeling

  • Wright Williams & Kelly—cost of ownership modeling

  • Savantage—conceptual design cost modeling

  • ABC—ABC cost modeling

  • IBIS—technical cost modeling

  • Prismark—niche parametric manufacturing cost modeling (for printed circuit boards)

Another area that is often overlooked is cost modeling for software development. Systems are a combination of hardware and software. Ideally hardware and software are "codesigned." Codesign allows an optimum partitioning of the required product functionality between hardware and software. Cost needs to be considered when these partitioning decisions are made. Several commercial tools allow the cost of developing new software, qualifying software, rehosting software, and maintaining software to be modeled. Historically, many of these tools are based on a public domain tool called COCOMO15 and later evolutions of it.
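For reference, the basic COCOMO relationship estimates development effort as a power law of delivered source size, Effort = a × (KSLOC)^b person-months, with published coefficient pairs for organic, semi-detached, and embedded projects and a companion schedule equation. The sketch below uses those published basic-model coefficients; the 45 KSLOC project size is a hypothetical example, and the commercial tools mentioned above layer calibration data and many cost drivers on top of this kind of core relationship.

    # Basic COCOMO (Boehm, 1981): effort in person-months as a power law of
    # size in thousands of delivered source lines (KSLOC), plus a schedule
    # estimate.  The 45 KSLOC example size is hypothetical.

    COEFFICIENTS = {
        # mode:            (a,   b,    c,   d)
        "organic":         (2.4, 1.05, 2.5, 0.38),
        "semi-detached":   (3.0, 1.12, 2.5, 0.35),
        "embedded":        (3.6, 1.20, 2.5, 0.32),
    }

    def cocomo_basic(ksloc, mode="semi-detached"):
        a, b, c, d = COEFFICIENTS[mode]
        effort_pm = a * ksloc ** b            # person-months
        schedule_months = c * effort_pm ** d  # calendar months
        return effort_pm, schedule_months

    effort, schedule = cocomo_basic(45.0)
    print(f"~{effort:.0f} person-months over ~{schedule:.0f} calendar months")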

Life-Cycle Cost Modeling

While manufacturing costing is relatively mature, life-cycle cost modeling is much less developed. For many types of products, manufacturing costs represent only a portion, sometimes a small portion, of the total cost of the product. Nonmanufacturing life-cycle costs include design, time-to-market impacts, liability, marketing and sales, environmental impact (end of life), and sustainment (reliability and maintainability effects). While reliability has been addressed by conventional DfX activities, other sustainability issues such as technology obsolescence and technology insertion are rarely addressed proactively.
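As a simple illustration of why the manufacturing figure alone can mislead, the sketch below (all figures hypothetical) rolls nonrecurring design cost, recurring manufacturing cost, sustainment, and end-of-life handling into a life-cycle cost per unit and reports how small the manufacturing share can be.

    # Life-cycle cost rollup (hypothetical figures): manufacturing is only one
    # slice of the total once design, sustainment, and end of life are included.

    units = 2_000                       # production quantity
    design_nre = 4_500_000              # nonrecurring design and qualification, $
    unit_mfg_cost = 3_200               # recurring manufacturing cost, $/unit
    service_years = 15
    sustainment_per_unit_year = 260     # spares, repair, obsolescence management, $/unit-year
    disposal_per_unit = 120             # end-of-life handling, $/unit

    manufacturing = units * unit_mfg_cost
    sustainment = units * service_years * sustainment_per_unit_year
    disposal = units * disposal_per_unit
    total = design_nre + manufacturing + sustainment + disposal

    print(f"life-cycle cost per unit: ${total / units:,.0f}")
    print(f"manufacturing share of life-cycle cost: {manufacturing / total:.0%}")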

There are existing commercial tool vendors in the life-cycle cost modeling space as well; for example:

  • Price Systems: parametric life-cycle management costing

  • Galorath: parametric life-cycle management costing (SEER tools)

  • NASA: well developed parametric cost modeling capabilities

One particular example of a life-cycle cost contributor for many types of systems is the lack of design-level treatment of technology obsolescence and insertion. This problem is already pervasive in avionics, military systems, and industrial controls and will become a significant contributor to life-cycle costs of many other types of high-technological-content systems within the next 10 years. Unfortunately, technology obsolescence and insertion issues cannot be treated during design today because methodologies, tools, and fundamental understanding are lacking.

Systems Engineering Issues

Box 3-1 shows four case studies of the use of systems engineering software tools in industry. In addition, the National Defense Industrial Association Systems Engineering Division Task Group Report,16 issued in January 2003, listed the top five issues in systems engineering:

15

B. Boehm, Software Engineering Economics, Prentice-Hall, Inc., Upper Saddle River, N.J., 1981, p. 1.

16

The National Defense Industrial Association, The National Defense Industrial Association Systems Engineering Division Task Group Report, January 2003.


  • Lack of awareness of the importance, value, timing, accountability, and organizational structure of systems engineering programs

  • Lack of adequate, qualified resources within government and industry for allocation within major programs

  • Insufficient systems engineering tools and environments to effectively execute systems engineering within programs

  • Inconsistent and ineffective application of requirements definition, development, and management

  • Poor initial program formulation

Similarly to curriculum modification in the educational system, the business and employee reward system will also need overhauling to ensure that it rewards those who think strategically rather than those who function in the old but safe ways. So long as the ones receiving the greatest rewards are the designers who turn out the greatest number of prints and models, or the purchasing agents who negotiate the lowest price for a given part, process, or material, no one can justify spending any significant amount of time or effort in the development of a better method to achieve design goals.

Initially, this cultural change may require a totally separate organizational unit with a reward structure tailored to recognize enterprise successes rather than discrete events. To be fully successful, this new culture ultimately has to infect all levels and units of an organization. For that to happen incentives are needed for the manufacturing leadership to change both itself and the culture. The saving aspect of such a sea-change is that, once it is a part of the culture, all the participants—both old and new—will win. The need to focus a portion of product development on minimizing time and costs in the traditional ways will remain but will be incorporated into the larger picture.

Systems Engineering Opportunities

While the committee found many areas where there is need for data structure, program interconnectivity, and visualization, the entire area of systems engineering presents new opportunities to rethink what process an organization or institution utilizes as it proceeds from the needs assessment phase through design and manufacturing to use, support, and ultimately disposal. Several significant and valuable characteristics become routine through the rigorous application of systems engineering. A few of the most significant are listed here:

  • Forces discipline

  • Creates multidimensional, conceptual design trade space

  • Enables multidisciplinary optimization

  • Promotes and requires interoperability

Other related factors include the following:

  • Forces cultural change in rewards


  • Tools exist but not in widespread use

  • Training required (academic and industrial)

Box 3-1
Four Case Studies on the Use of Systems Engineering Software Tools

Case Study 1

A simple example that illustrates trade-offs in design is the device that locates and carries the bearings and wheels that support the track on a tracked vehicle (such as a tank). This carrier can be designed either as an assembly of welded plates or as a casting. The welded version can be designed and placed into production in a fraction of the time required for a casting but is more costly to produce. An engineering cost analysis could predict that, for initial production and for long-term future service parts availability, the welded assembly would be preferred. That same analysis could predict that for many large-scale production applications, a casting would be preferred. The costs and value of each option could be evaluated and the lowest total cost option selected. And that analysis could predict that both options need to be developed: the welded version for initial production, and a scheduled update to replace it with a casting for the majority of its production life followed by reverting to a weldment when it goes into service-parts-only production.
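A minimal sketch of the kind of comparison the case study describes, with hypothetical tooling and unit costs: the welded assembly carries little nonrecurring cost but a higher unit cost, the casting the reverse, and the total-cost preference flips once the quantity passes a crossover volume.

    # Weldment-versus-casting trade (hypothetical numbers): compare nonrecurring
    # plus recurring cost over production quantity and find the crossover volume.

    WELDMENT = {"nonrecurring": 60_000,  "unit_cost": 480.0}
    CASTING  = {"nonrecurring": 450_000, "unit_cost": 310.0}

    def total_cost(option, quantity):
        return option["nonrecurring"] + option["unit_cost"] * quantity

    crossover = next(q for q in range(1, 1_000_000)
                     if total_cost(CASTING, q) < total_cost(WELDMENT, q))
    print(f"casting becomes cheaper above ~{crossover:,} units")
    for q in (500, 2_500, 10_000):
        cheaper = "casting" if total_cost(CASTING, q) < total_cost(WELDMENT, q) else "weldment"
        print(f"{q:>6,} units -> {cheaper}")

An engineering cost analysis of the sort described above would extend this comparison with the cost of developing both options, the planned switch between them over the production life, and the service-parts tail.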

Case Study 2

Siemens Transportation Systems utilized collaborative virtual product development software to create a complete cross-functional product definition and system-level simulation environment to validate total product functionality during the crucial concept phase of railway car manufacturing. Heinz-Simon Keil, Department Head, Corporate Technology, Production Processes Virtual Engineering: "… functional prototyping has enabled Siemens Transportation Systems to accelerate the overall virtual prototyping process and correct potentially costly errors on the fly before such errors are discovered in manufacturing." a

Case Study 3

Conti Temic Product Line Body Electronics utilized software tools to standardize model-based development for electronic control units (ECUs). It selected an integrated tool suite, the Statemate MAGNUM and Rhapsody in MicroC tool chain (or suite), to graphically specify designs, improve communication within its development team and with customers, reduce time to market through component reuse, and reduce costs by validating systems and software designs up front prior to implementation. This integrated tool suite provides Conti Temic with a development process from requirements to code, ensuring its applications are complete and accurate.

Conti Temic Body Electronics committed to the model-based approach due to the complexity and diversity of body control ECUs. Conti Temic system engineers create formal requirements specifications and then test their models in a virtual prototyping environment, ensuring that their ECU models are error free. The system behavior is validated as an integral part of the design process, before anything is built. Conti Temic is able to execute, or simulate, the design—either complete or partially complete—prior to implementation. Analyzing its specifications up front, Conti Temic ensures that behavior is correct and captures test data that will be used later to test the implementation. Conti Temic benefits from the ability to automatically generate high-quality target code, easily capture and use features or functions throughout the development process, and automatically generate documentation from the completed specification model. Working with supplier requirements, Conti Temic is able to visually express systems functionality and ensure, up front in the design process, that the product will meet the specifications.

In many cases, automotive original equipment manufacturers (OEMs) provide Conti Temic actual requirements models. The use of a standard tool between OEM and supplier could greatly reduce miscommunication common within the automotive industry. "In addition to facilitating communication, Conti Temic Body Electronics is able to quickly make changes and ensure their designs are solid if issues arise in ECU integration, functionality is changed late in the game, cost reductions are mandated, or variant derivation has to be incorporated and validated within the product," said Andreas Nagl, Software-Engineering, Conti Temic Product Line Body Electronics. "Due to the complexity of our products, the synchronization effort for requirements, analysis/design models, code and test cases increased exponentially with conventional static CASE tools. The Statemate MAGNUM and Rhapsody in MicroC tool chain (or suite), unlike static CASE tools, allows us to 'feel' the systems behavior, make changes on the fly, validate our design, generate code and conduct tests much more quickly." b

Case Study 4

An aerospace contractor used requirements management software to model the systems architecture of a winner-take-all proposal to build a new cost-efficient AWACS aircraft for the Royal Australian Air Force. The contractor realized that using its standard document-driven methodology would not be successful and so it modeled the architecture at both a high level of abstraction and according to scenarios with lower levels of fidelity. The contractor also modeled its business processes and other key mission-critical items. This allowed the company to uncover and rectify many unforeseen issues significantly earlier in the design process and to reduce the risk to the proposal. The company realized that the use of both static and dynamic modeling had become indispensable to reducing program risk.

The contractor also decided to build multiple segmented and secure baselines for future use because it wanted to capture all of the key design knowledge of this project for use in developing future proposals. This data included classified, vendor proprietary, and customer-specific information that could not be shared in a single repository. This approach reflects the understanding that the ability to capture and reuse design knowledge is critical for long-term program evolution.

In the first design review, the customer found no discrepancy reports and the results of the final design review mirrored the first. The contractor team won the proposal on both technical merit and cost. The cost savings were so significant that the customer was able to purchase additional options once thought beyond its budget. Key to this success was project-level collaboration among all team members—systems engineers, software engineers, and managers—including translating among subcontractor design process methods and standards.

ENGINEERING DESIGN TOOLS

Engineering design is the process of producing a description of a device or system that will provide a desired performance or behavior. Figure 3-6 is a typical chart depicting the steps in this process. The process begins with the determination or assessment of a need. Next, a set of requirements and constraints is established for the device or system, often also including a list of desired performances. At this stage in the process, the engineering team begins to propose alternative configurations. A series of calculations and analyses are performed to estimate how well each of the proposed configurations will satisfy the requirements and constraints, and deliver the desired performances. These analyses typically range from hand calculations to simple analytic expressions to more sophisticated modeling and simulation, such as finite element modeling, thermal modeling, and computational fluid dynamics.

The results of these simulations are interpreted by the design engineers and used to refine the proposed configurations, and to reject those that appear to not satisfy the requirements and constraints. It is common to construct at least one prototype to verify the simulated performance of the proposed configuration(s). In this process, design engineers focus their attention primarily on achieving the desired performances, while meeting the requirements and constraints. Questions of manufacturability are typically a secondary consideration during the design process.

Engineering design is an iterative process involving every step in the design process. The flow in Figure 3-6 does not proceed linearly downward; instead, there are iterations back and forth between all the steps, as shown by the long column box at the left of Figure 3-6.

Once a design configuration is selected, analyzed, simulated, prototyped, and validated, the design information is passed to the manufacturing engineers to design the manufacturing systems and processes to fabricate the design in the desired quantities. This manufacturing engineering process entails many of the same steps as in the engineering design process, including the application of sophisticated modeling and simulation.

Relationship of Engineering Design Tools to Manufacturing

Since the device or system must be fabricated in order to deliver a desired performance or behavior, engineering design should be closely linked to manufacturing. Indeed, frequently questions of manufacturability severely limit the range of design options available. However, in current engineering practice, as described above, the link between design and manufacturing is largely informal, based on the knowledge and experience of the engineers involved. While many engineering design and analysis software packages exist, and several powerful manufacturing simulation software packages exist, the link between these two domains remains weak. Appendix C describes some of the current engineering design tools, and Appendix D provides a list of representative vendors of computer-based tools used for design and other functions.

Current Status of the Bridge Between Design and Manufacturing

Figure 3-1 depicts the current status of the link, or bridge, between engineering design and manufacturing. The columns represent many of the identifiable stages in the design of a new product or system, with time proceeding from left to right. The rows indicate whether the tool or process depicted in the diagram contributes to strategic planning and decisions, or applies to tactical decisions and steps, or plays a role in carrying out the execution of the stage in the process. The width of the ellipse surrounding the name of each tool indicates the stages in the process in which the tool can play a role. The height of the ellipse indicates where the tool is effective, along the range from purely strategic planning to execution of individual process steps. Blue ellipses indicate engineering design processes. Green ellipses indicate manufacturing-related processes. Yellow ellipses indicate business functions, orange ellipses indicate data management, and empty ellipses represent overarching processes. The degree of overlap between ellipses indicates how much interaction there should be between tools.

As can be seen in Figure 3-1, engineering modeling, simulation, and visualization can play a strategic role in product architecture planning, and also a tactical role in guiding the detailed design processes. The detailed design process is initiated from the results of product architecture planning, and it results in a precise description of the product to be manufactured. The diagram shows that there is little overlap between manufacturing modeling and simulation tools, or manufacturing process planning, and engineering design tools, reflecting the lack of interoperability between these steps with currently available software. However, there are efforts being made to change this situation.

FIGURE 3-6 Steps in the engineering design process. Source: Adapted from G. Pahl and W. Beitz, Engineering Design, p. 41. Copyright © 1984 Springer-Verlag. Reprinted by permission of Springer-Verlag GmbH, Heidelberg, Germany.

Box 3-2 shows a case study of an integrated design approach applied to unmanned undersea vehicles (UUVs). Past research and development activities to create a highly flexible and responsive design environment include the Defense Advanced Research Projects Agency's (DARPA) Rapid Design Exploration and Optimization (RaDEO) program.17 Unigraphics NX is one example of an end-to-end product development solution for a comprehensive set of integrated design, engineering, and manufacturing applications.18

Roles for Computational Tools

As Figure 3-1 illustrates, the intercommunication and interoperation of engineering design and manufacturing computational tools can play a key role in establishing a strong bridge between design and manufacturing. During the study, the committee received multiple presentations that outlined a future vision of a stronger bridge and identified key difficulties that such a bridge could overcome, including:

  • Multiple models: During the engineering and manufacturing process, each stage of the process, and each engineering discipline, typically employs computational tools, and builds computational models, that are unique to that activity. A key challenge is the time and effort required to create multiple models, and the difficulty in ensuring that the models are consistent. It is common for a change in the configuration of a design to be introduced in one model but not propagated to the others, leading to inconsistent analyses of performance and to failures in the field.

  • Design reuse: Because of the large number of noninteroperating engineering design and manufacturing software packages in use, it is not common for models of earlier similar designs to be reused and improved. Typically, a new model of an improved component or subsystem is constructed. In this environment, the incentive for engineers to reuse portions of earlier designs is limited.

  • "No-build" conditions: A key role repeatedly identified during the course of the study for a stronger bridge between engineering design and manufacturing is the ability, particularly early in the engineering design process, to easily identify designs that cannot be fabricated or assembled (no-build conditions). Without the ability to analyze early designs for their capability to be fabricated and assembled, unrealizable designs can persist late into the design process.

  • Early manufacturing considerations: Decisions made during the early engineering design stages typically determine 70 to 80 percent of the final cost of a product. Without a strong bridge between design and manufacturing, these decisions are commonly made with little information relating to costs and degree of difficulty of fabrication and assembly of a proposed design. This lack of manufacturing information in the early engineering design decision-making process is a key contributor to cost overruns late in the development cycle of a product.

Advancing the interoperability19 and composability20 of design and manufacturing software, particularly modeling and simulation, will contribute significantly to reducing the difficulties enumerated above.

  • Multiresolution21 interoperable models will reduce or eliminate the problems of multiple models and inconsistent analyses.

  • Intercommunication between multiple engineering and manufacturing software packages will greatly enhance the ability of engineers to retrieve models of earlier designs and establish a starting point for a next-generation product.

  • Testing and resolving no-build conditions early in the design process will reduce or eliminate the cost of maturing, late into the design process, a design that cannot be fabricated or assembled.

  • Perhaps the most critical improvement that can result from the integration of design and manufacturing models is that early design decisions can be made with consideration of their manufacturing ramifications. This integration can have a dramatic impact on the total product cost.

17 Defense Advanced Research Projects Agency, Rapid Design Exploration and Optimization. Available at: http://www.darpa.mil/dso/trans/swo.htm. Accessed February 2003.

18 Unigraphics PLM Solutions, an EDS company, NX: Overview. Available at: http://www.eds.com/products/nx/. Accessed May 2004.

19 In this context, interoperability is the ability to integrate some or all functions of more than one model or simulation during operation. It also describes the ability to use more than one model or simulation together to collaboratively model or simulate a common synthetic environment.

20 In this context, composability is the ability to select and assemble components of different models or simulations in various combinations into a complex software system.

21 Multiresolution modeling is defined as the representation of real-world systems at more than one level of resolution in a model, with the level of resolution dynamically variable to meet the needs of the situation.

Box 3-2
Integrated Design of Unmanned Undersea Vehicles

Technological innovations are facilitating the expanded use of UUVs to perform complex and dangerous missions.a Improvements in sensors, guidance and control, power systems, and propulsion systems have dramatically improved the functionality, flexibility, and performance of these vehicles. As a result, commercial and military interest in these vehicles and in the potential dual-use capability of these enabling technologies is keen. Another important technological innovation involves the design process itself.

Researchers at the Applied Research Laboratory (ARL) at Pennsylvania State University recently developed an integrated design process utilizing advanced computational methods and successfully applied it to develop a UUV system. Development of a new design methodology was made necessary by contractual requirements that mandated substantially shorter design time and lower design and ownership costs than traditional design methodology could deliver. Another inducement was the expectation that this investment in developing an integrated system could spill over to other applications and projects in the future. By several measures the effort was a success, reducing project cost and cutting development time from 3 to 4 years to 12 to 18 months.

The integrated design tools developed by ARL minimize the need for numerous and expensive interim experiments, relying more on fundamental, physics-based computational models of fluid dynamics. The design path is conceptually similar to conventional design paths, with tunnel testing replaced with computer simulations. For example, the propulsor design and analysis tool (PDAT) and Reynolds-averaged Navier-Stokes (RANS) analysis are used extensively to simulate the drag and stability of a vehicle under different design parameters, such as length, diameter, and nose and tail contours. Computer automated design is also used for mechanical design and structural analysis. The maneuverability of the vehicle is simulated using an ocean dynamics model and pitching motion simulations, essentially replacing physical with numerical experiments.

The first vehicle designed with this integrated approach was completed in 14 months, below projected cost, and verified at Lake Erie. The UUV requirements, however, are generally far less demanding than those of many other defense and commercial products. The mission requirements for the UUV include low speed, variable payloads, a small fleet of 15 to 20 vehicles, and long-endurance missions. The design criteria, in order of importance, include reliability, cost, maneuverability, efficiency, and stealth. Simulation-based design approaches are also under development for advanced torpedoes and submarines but are likely to be more challenging because stealth and other design criteria that are more difficult and costly to achieve are relatively more important in these applications.

In committing to such an effort, ARL had to assume the risk of failure to deliver the product by the imposed deadline at the contractual cost. Risk, therefore, is a major consideration in devoting the necessary time and money to develop integrated design tools. In ARL's case, it had the experience and scientific knowledge to assess these risks and decided that they were worth assuming given the expected benefits, including the immediate goal of meeting the contractual specifications and the longer-term goals of enhanced design capability and the greater likelihood of future contracts arising from improved goodwill with the project sponsors.

This case study illustrates that the keys to success of a new design strategy are advanced system requirements, reduced traditional development cycle times, reduced design and development costs, and reduced life-cycle or total ownership costs. These can be accomplished by early application of the systems design approach to increase the number and fidelity of design trade-off analyses.

The development of the UUV offers several lessons for bridging the gap between design and manufacturing. The physics-based models developed at ARL will play an important role in future integrated design efforts, particularly in simulating product performance. The UUV project at ARL also illustrates that tool development is not the result of a targeted research and development program but rather a means to an end. Perhaps this approach provides for more efficient tool development and integration. On the other hand, the lack of targeted research and development funding for tool integration may hinder the development of truly path-breaking integrated design simulation tools.

Another constraint encountered by ARL was computer resources. The computations were performed using networked computer workstations. Supercomputer resources were accessible but not as readily available on a real-time basis. Both computing environments, however, pose limits on the degree of accuracy in the models, particularly those that model cavitation and acoustics. As computer speeds increase in the future, modeling these processes at finer resolution will be possible.

Finally, the availability of U.S. citizens to work on these projects is limited and poses some real constraints on system integration development. This last constraint raises a number of training and education issues that may be particularly vexing for policy makers as they seek ways to foster a more efficient design process.

a M.J. Pierzga, "ARL Integrated Design Approach Using Computational Fluid Dynamics," and C.R. Zentner and W.M. Moyer, "Unmanned Undersea Vehicle Technology," Applied Research Laboratory Review, Pennsylvania State University, University Park, Pa., pp. 21-22, May 2001.

MATERIALS SCIENCE TOOLS

The goal of materials science and engineering is to link the structure and composition of materials with their manufacturing and properties—in other words, to develop models for materials behavior and performance during and after manufacturing. To achieve this, materials engineers utilize tools that span the entire range of length scales, from the electronic through the continuum scale. In fact, a long chain of successful models punctuates the history of materials science. Some models, such as those based on semiconductor device theory, rely on fundamental physics; others, such as annealing curves, arise from phenomenology; and most, such as phase diagrams, combine theory and observation. Within the past two decades, the subspecialty of computational materials science has provided additional modeling tools to the materials scientist. With applications ranging from empirical (expert systems) to fundamental (ab initio electronic structure calculations), computational materials science enables a more integral link between materials, design, and manufacturing, as illustrated in Figure 3-7.

Current Tools

Research Codes

The purview of materials science and engineering is scientifically and technologically vast. Everything is made from materials, and the responsibilities of materials scientists range from the extraction of raw materials through processing and manufacturing to performance and reliability. Because of its broad scope, materials engineering is a decade or so behind other engineering disciplines in developing a core set of computational tools. For example, mechanical engineers are trained as undergraduate students in using finite element modeling (FEM) for heat and mass transfer, and a variety of commercial FEM packages are in wide industrial use. Materials scientists have no comparable computational training or tools. Even in cases where extensive scientific tools are available for prediction of basic materials properties or structures, their connectivity to real-life products and large-scale applications remains highly inadequate.

While computational materials science continues to progress, most of its applications remain research codes, with a few notable exceptions that are discussed below. Research codes are created for a variety of reasons, but their common characteristic is that they are written for a limited, specialized user base. Since users are assumed to be experts in both the scientific and computational aspects of the code, most research codes suffer from poor documentation, lack of a friendly user interface, platform incompatibility, no user support, and lack of extendibility. Furthermore, because research codes are usually written for a single purpose and customer, they may not even be numerically stable or scientifically correct. Of course, there are many examples of research codes that are well supported and responsive to customers, such as Surface Evolver, a code that calculates the wetting and spreading of liquids on surfaces.22 These are often labors of love, supported by a single researcher or group; the danger is that there is no guarantee of continuity of support as funding or personal circumstances change. On the positive side, many research codes are available without cost, and many researchers are delighted to reach new customers for their codes.

As a research code adds capabilities and demonstrates its utility to more users, it may graduate to a more sophisticated, stable, and supported application, either through commercialization23 or through the open-source paradigm.24 While a few materials science codes have made this transition, none pervade the field, particularly within the ranks of the practicing materials designers and engineers.

Materials Science Models

While computational materials models are quite diverse, they can be loosely classified by length scale and scientific content into five categories: atomic-scale models, mesoscale models, continuum models, thermodynamic models, and databases. Within these categories, physics-based models, which utilize scientific theory to predict materials behavior, can be distinguished from empirical models, which rely on experimental observation and prediction of trends to do the same. Of course many models are hybrids, and include both fundamental science and empirical data.

Atomic-Scale Models. Atomic-scale simulations were among the first scientific computer applications. From simple early simulations, which treated atoms as classical hard spheres, these models have evolved into sophisticated tools for predicting a wide range of material behavior. As such, they represent a well-developed area of computational materials science.

Atomic-scale models are divided into two types: ab initio electronic structure calculations and empirical atomistic simulations. Electronic structure models solve simplified versions of the Schrödinger equations to generate the electron density profile in an array of atoms. From this profile, the position and bonding of every atom in the system can be determined. In theory, ab initio simulations provide a complete description of a system: its structure, thermodynamics, and properties for idealized situations and limited size.
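For reference, the equations these codes approximate can be stated generically (a textbook summary added here for illustration, not tied to any particular package): the many-electron Schrödinger equation, and the effective one-electron Kohn-Sham equations of density functional theory used by codes such as VASP and WIEN2k.

\[
\hat{H}\,\Psi(\mathbf{r}_1,\ldots,\mathbf{r}_N) = E\,\Psi(\mathbf{r}_1,\ldots,\mathbf{r}_N),
\]
\[
\left[-\frac{\hbar^2}{2m}\nabla^2 + V_{\mathrm{ext}}(\mathbf{r}) + V_{\mathrm{H}}[n](\mathbf{r}) + V_{\mathrm{xc}}[n](\mathbf{r})\right]\psi_i(\mathbf{r}) = \varepsilon_i\,\psi_i(\mathbf{r}),
\qquad n(\mathbf{r}) = \sum_{i\in\mathrm{occ}} |\psi_i(\mathbf{r})|^2 .
\]

The approximations discussed below enter largely through the exchange-correlation term V_xc (and, in practice, through basis sets and pseudopotentials), which is why even careful calculations carry the accuracy limits noted in the next paragraph.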

In practice, ab initio simulations are limited in two ways. First, because the many-electron Schrödinger equation cannot be solved in closed form, all ab initio simulations utilize approximations. A major focus of researchers in this area is improving the approximations and quantifying their effects. However, their limitations remain important; for example, ab initio calculations are typically accurate to within 0.1 eV. This resolution limit can mean prediction of incorrect equilibrium crystal structures and errors in calculating melting points of up to 300 K. Second, electronic structure calculations are extremely computing intensive. Current heroic calculations may simulate 10^5 atoms for a nanosecond. Even with geometrically increasing computer resources, the capability to simulate a mole of atoms in an hour remains a long-term goal.

Despite their limitations, electronic structure calculations are valuable tools for elucidating the underlying physics of materials behavior. They have been particularly successful in calculating phase diagrams, crystal structures, solute distribution, and the structure and properties of internal defects. Because they include electronic bonding, ab initio simulations are the only first-principles method for predicting chemical interactions, including alloy chemistry, structure, and surface interactions. A large number of ab initio electronic structure codes have been developed, and several, such as VASP25 and WIEN2k,26 are commercially available and widely utilized. These simulations are still research codes; generating meaningful results requires graduate-level knowledge of both the technique and the particular code. However, both packages can be considered robust and supported tools.

24 An example is the ABINIT code, whose main program allows one to find the total energy, charge density, and electronic structure of systems. More information is available at: http://www.abinit.org/. Accessed April 2004.

25 G. Kresse, "Vienna Ab-initio Simulation Package," October 12, 1999. Available at: http://cms.mpi.univie.ac.at/vasp. Accessed April 2004.

26 P. Blaha, K. Schwarz, G. Madsen, D. Kvasnicka, and J. Luitz, "WIEN2k," 2001. Available at: http://www.wien2k.at/. Accessed April 2004.


FIGURE 3-7 Models for linking design, manufacturing, and materials.

For larger atomic systems, up to 10^10 atoms for a microsecond, scientists have developed empirical atomistic simulations. These models use a classical, empirically derived function for the potential energy between every pair (or in some cases, triplet) of atoms. These potential functions may be derived from the electron density profiles, but they do not include electronic terms explicitly. Like ab initio simulations, empirical atomistic models yield a map of all the atoms in a system. With their larger length and time scales, empirical atomistics are particularly useful in examining the dynamics and interactions of atomic features, including defects and solutes. However, because the potential functions ignore nonclassical and electronic effects, the accuracy of the models is necessarily limited.

The most commonly used empirical potential model for metallic materials is the embedded atom method (EAM). Several EAM research codes are publicly available.27 Similar empirical codes are used for polymeric materials.28
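The generic functional form of the EAM total energy, quoted here only for reference (it is the standard expression in the literature, not specific to any of the codes cited), is

\[
E_{\mathrm{tot}} = \sum_i F_i\!\left(\sum_{j\neq i}\rho_j(r_{ij})\right) + \frac{1}{2}\sum_{i}\sum_{j\neq i}\phi_{ij}(r_{ij}),
\]

where F_i is the energy to embed atom i in the local electron density contributed by its neighbors, ρ_j(r) is that contribution, and φ_ij is a classical pair potential. The fitted functions F, ρ, and φ are exactly where the empiricism, and hence the accuracy limits noted above, enter.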

Both the strengths (fundamental science basis) and weaknesses (limited system size, timescale, and accuracy) of atomic-scale models are well understood. Developing new techniques to increase simulation size and duration, possibly via coupling with the mesoscale, and exploring new scientific formulations to increase accuracy are the challenges that industry must overcome to widen the use of these methods in design and manufacturing.

27 Nuclear Energy Agency, "DYNAMO, Structure and Dynamics of Metallic System by Embedded Atom Method," March 25, 2002, available at: http://www.nea.fr/abs/html/ests0788.html, accessed April 2004; and S.J. Plimpton, "Fast Parallel Algorithms for Short-Range Molecular Dynamics," J. Comput. Phys., Vol. 117, pp. 1-19, 1995, available at: http://www.cs.sandia.gov/~sjplimp/codes.html, accessed April 2004.

28 S.J. Plimpton, R. Pollock, and M. Stevens, "Particle-Mesh Ewald and rRESPA for Parallel Molecular Dynamics Simulations," in Proceedings of the Eighth SIAM Conference on Parallel Processing for Scientific Computing, Minneapolis, Minn., March 1997. Available at: http://www.cs.sandia.gov/~sjplimp/lammps.html. Accessed April 2004.


Mesoscale Models. The mesoscale—encompassing length scales between the atomic and the continuum—is the traditional and more application-oriented purview of the materials scientist. However, it is at these length scales that computational tools are most rudimentary.

Because microstructural development is critically important to materials processing and properties, mesoscale models at the polycrystalline grain scale (usually 1 to 1,000 μm) have been the subjects of extensive industrial and research efforts over the past decade or so. A plethora of models have been developed, including dislocation dynamics models for plastic deformation, grain-scale evolution models for annealing, polycrystal plasticity deformation simulations, deposition models, solidification models, and many others. Almost every mesoscale-mediated process or property has been the subject of a computational model. Some common denominator phenomena have been the subject of multiple, competing models; for example, no fewer than five different well-established models describe grain growth in polycrystalline metals.
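As a concrete example of the kind of relation these mesoscale models aim to reproduce and generalize, classical isothermal grain growth is commonly described by the phenomenological power law (a textbook form, shown only for illustration)

\[
\bar{d}^{\,n} - \bar{d}_0^{\,n} = K t, \qquad K = K_0 \exp\!\left(-\frac{Q}{RT}\right),
\]

where d-bar is the mean grain diameter, n is approximately 2 in the ideal case, and Q is an activation energy. Real alloys deviate from this ideal behavior, and capturing those deviations for realistic microstructures is precisely where the competing grain-growth models differ.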

Mesoscale models are based on combinations of fundamental physics and empirical data, with different models occupying different parts of the spectrum. However, nearly without exception, current mesoscale models are research codes. While they may be more or less available, usable, and supported, they all require specialized scientific and computational knowledge, well beyond the undergraduate materials science curriculum. Moreover, most of these tools have not been validated for realistic, complex materials systems; their accuracy and applicability remain unknown. Because of this, mesoscale models remain a research opportunity but are little used in the manufacturing design process.

Continuum Models. During the design process, the behavior of a material in a component is often the most important unknown factor, impacting both design and manufacturing decisions. The designer needs to know whether a material can be formed into a particular shape, and how that material will perform after forming. Continuum material response models, usually based on an empirical constitutive material model implemented in a finite element solver, are widely used to simulate the mechanical response of materials in arbitrary geometries and environments.

Unlike smaller length scale models, FEM-based solvers are commercially available. Abaqus,29 NASTRAN,30 DYNA3D,31 and ANSYS32 are four examples of widely utilized FEM solvers. Commercial applications offer extensive user support, and because these models are so widely used, undergraduate engineering curricula often include units on FEM modeling. These methods are a near-universal component of the engineering design process at large and small companies alike, and are an important component of CAE.

29 ABAQUS, Inc., Pawtucket, Rhode Island. Available at: http://www.hks.com/. Accessed April 2004.

30 MSC Software, Santa Ana, California. Available at: http://www.mscsoftware.com/. Accessed April 2004.

31 Lawrence Livermore National Laboratory, Livermore, California. Available at: http://www.llnl.gov/eng/mdg/Codes/DYNA3D/body_dyna3d.html. Accessed April 2004.

32 ANSYS, Inc., Canonsburg, Pennsylvania. Available at: http://www.ansys.com/. Accessed April 2004.

FEM solvers require a model for material response; this is usually a set of constitutive equations. These equations take a form that may be motivated by the underlying physics of the response being modeled, or that may simply be a curve fit to experimental data. In either case, the numerical quantities in the equations are determined by fitting them to experimental data. Depending on the response being modeled, the data in question may be a simple stress–strain curve or the results of a complex set of time- and rate-dependent experiments. While constitutive models for simple materials are often included in solver packages, the design process frequently requires more complex and realistic models, many of which are available in the open literature. More problematic is the data required to fit the model for a particular material and process. Some material response data are available in the open literature, although finding and validating them can be challenging; other data may be available but undocumented; still other data may be proprietary. The need for easily accessible and accurate material data, which is the greatest limitation of continuum modeling for design, is discussed in more detail below.
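A minimal sketch of the fitting step described above, using a hypothetical data set and the simple Hollomon power-law hardening form (the numbers and parameter names are illustrative assumptions, not data from the report):

```python
# Illustrative only: fit a power-law hardening model, sigma = K * eps**n,
# to a few hypothetical true stress-strain points. Real constitutive
# calibration uses richer data and models, but the workflow is the same:
# choose a form, fit its constants to experiment, enter them in the solver.
import numpy as np
from scipy.optimize import curve_fit

eps_p = np.array([0.002, 0.01, 0.02, 0.05, 0.10, 0.15])      # true plastic strain
sigma = np.array([210.0, 285.0, 330.0, 410.0, 480.0, 525.0])  # true stress, MPa

def hollomon(eps, K, n):
    """Power-law (Hollomon) hardening: sigma = K * eps**n."""
    return K * eps**n

(K_fit, n_fit), _ = curve_fit(hollomon, eps_p, sigma, p0=(500.0, 0.2))
print(f"K = {K_fit:.0f} MPa, n = {n_fit:.3f}")
# The fitted pair (K, n) is what would be supplied to the FEM solver's
# material definition as the plasticity parameters.
```

For rate- or temperature-dependent responses, the same procedure is applied to more elaborate constitutive forms and correspondingly richer experimental data.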

Thermodynamic Models. The compilation and application of phase diagram data has been a traditional responsibility of the materials engineer. Undergraduate students learn that because thermodynamics constrains phase diagrams, one need not sample every point in composition space to determine the entire diagram. In fact, thermodynamic models can calculate accurate phase diagrams from a few select data points.

Commercial thermodynamic models such as Thermo-Calc33 are in routine use by alloy designers as well as process engineers. These models permit the engineer to determine phase information for simple alloys, to explore new compositions, or to tailor a process for a particular heat. For complex alloys, however, the models cannot predict the existence, composition, and/or structure of the constituent phases a priori. Indeed, without experimental verification, these predictions are highly questionable. Even for simple alloy systems, the models depend on the availability of the thermodynamic properties of the alloy components and the interactions between them. Ideally, such information is acquired from experimentally determined and verified databases. Otherwise, the models depend on the use of the available phase diagrams.
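The machinery behind such tools is an assessed Gibbs-energy expression for every candidate phase; for a binary solution phase the simplest such expression is the regular-solution form, quoted here as a generic illustration (the interaction parameter Ω is exactly the kind of experimentally assessed quantity whose availability is discussed above):

\[
G_m = x_A\,{}^{\circ}G_A + x_B\,{}^{\circ}G_B + RT\,(x_A \ln x_A + x_B \ln x_B) + \Omega\, x_A x_B .
\]

The phase diagram then follows by minimizing the total Gibbs energy over all competing phases at each temperature and overall composition (the common-tangent construction), which is why a few well-chosen data points can anchor an entire diagram.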

Kinetic models, such as DICTRA,34 apply chemical kinetics to such thermodynamic information in order to determine phase transformation rates for process control. Here again, however, the models are inadequate for predicting complex situations in which various processes may take place simultaneously. Box 3-3 illustrates how such models and sensor measurements are used in commercial gas carburizing heat treatment processes.

While both of these (admittedly limited) models are mature applications, with a wide user base and education support, they are data-limited. To produce accurate results, both models require accurate input in sufficient quantities. During the 1960s and 1970s, when U.S. alloys dominated the metals markets, industry and government together funded broad programs to generate thermodynamic data for steel, aluminum, and specialty metals. However, those efforts ended three decades ago.

Now, because performing the experiments to generate thermodynamic data is considered a mature technology, there is little public funding available to do it, so it is performed in-house. Moreover, despite the need to certify new materials, there is little thermodynamic data for new alloy systems. Both absent and inaccessible data, discussed below, are barriers to the widespread use of thermodynamic models.
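Box 3-3, which follows, describes how sensor-driven models are used to control gas carburizing. As background, the underlying diffusion physics can be sketched with the classical constant-surface-concentration solution of Fick's second law; the diffusivity parameters below are typical textbook values for carbon in austenite, and the cycle conditions are illustrative only.

```python
# Illustrative sketch of the carbon profile behind Box 3-3, using the
# classical error-function solution of Fick's second law for a constant
# surface carbon potential:
#     C(x, t) = Cs - (Cs - C0) * erf( x / (2 * sqrt(D * t)) )
# All parameter values are textbook/illustrative, not process data.
import numpy as np
from scipy.special import erf

C0, Cs = 0.20, 0.90                  # core and surface carbon contents, wt%
D0, Q, R = 2.3e-5, 148_000.0, 8.314  # m^2/s, J/mol, J/(mol K)
T = 925.0 + 273.15                   # carburizing temperature, K
t = 6 * 3600.0                       # 6-hour cycle, s

D = D0 * np.exp(-Q / (R * T))        # Arrhenius diffusivity of C in austenite
x = np.linspace(0.0, 2.0e-3, 9)      # depth below the surface, 0 to 2 mm
C = Cs - (Cs - C0) * erf(x / (2.0 * np.sqrt(D * t)))

for depth, carbon in zip(x, C):
    print(f"{depth * 1000:5.2f} mm : {carbon:.2f} wt% C")
```

Raising the furnace temperature raises D exponentially, which is why the better sensing described in the box, by permitting higher furnace temperatures, translates directly into shorter cycles.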

Box 3-3
Gas Carburization Heat Treatments in Industry

The prehistoric manufacturing process of carburizing is one of several economical forms of surface hardening. The process is applied to a wide variety of parts where very high levels of surface and near-surface strength, crushing resistance, toughness, and wear resistance are required in combination with significantly lesser requirements for the part's center or core, therefore not justifying the added material costs associated with through-hardening. Gears, bearings, latches, hammers, tools, and spindles are but a few of the many thousands of parts that are carburized daily.

The typical gas carburizing process involves the solid-state diffusion of carbon through the surface into a low-carbon steel part that is subsequently quenched and tempered to result in a stronger, tougher, and more wear-resistant part. Today's most common versions use one or more of several carbon source gases (e.g., natural gas, methane, propane) mixed with one or more of several neutral, carrier gases (e.g., carbon monoxide, carbon dioxide, nitrogen, argon) at a moderately high temperature (approximately 900 to 950°C). Dozens of measurements, sensors, models, and predictions are utilized throughout the manufacturing cycle.

Starting with an initial part geometry and anticipated field/customer duty cycle, the designer models the expected load history onto the part geometry to determine the required performance properties at each location within the part. Other models are used to predict the strength and/or fatigue performance based on the measurable attribute of hardness. The final hardness profile will be a function of the carbon profile, alloy content, cooling rate after carburizing, and tempering temperature. The carbon profile developed during heat treatment will be a function of the initial carbon of the steel, the carburizing gas mixture, surface conditions, carbon diffusion rate, and the time in the furnace. Since neither the carbon diffusion rate nor the carbon profile can be measured nondestructively or in real time, sensors and their associated models are used to predict what is actually occurring during the carburizing cycle, and statistical, destructive sampling is used only to confirm the predictions after the process has been completed.

The area of largest concern during the carburizing process is the prediction, at each instant in time, of the rate of introduction of the carbon into the steel surface and the rate of diffusion of that carbon into the part. Therein lies the opportunity for improvements. For decades, the determination of "dew point"—primarily a function of moisture, CO/CO2, temperature, and pressure—was used to determine the carburizing potential of the furnace atmosphere. More recently, oxygen sensors have been utilized with measurable improvements in process control and yield. Improvements in process control through better sensors permit increases in furnace temperature, yielding substantial reductions in cycle time and cost. Current research is seeking to utilize laser and other advanced sensing techniques to measure the carbon potential of the atmosphere even more accurately. With typical 4- to 8-hour or longer carburizing cycle times, even minor reductions in cycle time become significant cost reductions, while the improved process control concurrently yields quality improvements, such as controlled grain growth.

In a large-scale captive heat treatment shop, use of a process optimization scheme for gas carburizing decreases batch time and increases throughput. This higher efficiency permits smaller capital outlay for furnaces; one anonymous shop saw a 20 percent decrease in furnace acquisitions, yielding a capital cost savings of $2 million, with commensurate operational cost decreases.

Databases. A pervasive theme in any discussion of computational materials science tools is the need for data. Data enable model development, validation, and application. In a real sense, useful, accessible data, not models or computer codes, are the desired end product of materials science research.

Unfortunately, the product is often lost or hidden. Consider three scenarios for some data—a stress–strain curve, for example—generated to support a design program:

  • Ideally, the stress–strain data, along with supporting information on alloy composition and testing methodology, are stored in a central database. In the current system, this database is likely proprietary, so while this scenario gives a good chance for data survival and usefulness, it is at the cost of external accessibility.

  • Another possibility is publishing the data in the open literature. Because stress–strain data are usually presented in graphical form in scientific papers, the numerical values of these data will not be preserved, and without the standard data format enforced by a database, some of the supporting information will not be reported. In addition, finding the data will depend on literature search engines tailored not to data but to concepts. While publication assures that the data will survive and be (theoretically) accessible, the practical utility of the data may suffer.

  • In the most likely scenario, the data are recorded in a lab notebook or on a desktop computer. They are used for the current investigation, and perhaps referred back to while remembered, but ultimately, with staff or computer system changes, they are forgotten. In this case, the data are not accessible, useful, or enduring, and the stress–strain experiments will have to be repeated each time this alloy is considered.

Examining the current options, it is easy to define an ideal distributed data storage system. It would use a database structure that permits storage of entire data sets as well as their supporting information. It would be publicly accessible, with a tailored search capability. Finally, it would store data in a format immune to hardware or software changes.
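A minimal sketch of the kind of record such a repository would hold (the field names, alloy, and values are the author's illustrative assumptions, not a proposed standard): the full data set plus the supporting information needed to interpret and reuse it, serialized to a plain, software-neutral format.

```python
# Illustrative record structure for a materials property repository:
# store the entire curve plus its metadata, in a format that does not
# depend on any particular program surviving.
import json
from dataclasses import dataclass, asdict
from typing import List

@dataclass
class TensileTestRecord:
    material: str                 # alloy designation
    composition_wt_pct: dict      # measured chemistry
    test_standard: str            # testing methodology
    temperature_C: float
    strain: List[float]           # full curve, not just summary values
    stress_MPa: List[float]
    notes: str = "hypothetical example record"

record = TensileTestRecord(
    material="A356 aluminum (illustrative)",
    composition_wt_pct={"Si": 7.0, "Mg": 0.35, "Al": "balance"},
    test_standard="ASTM E8 (assumed)",
    temperature_C=25.0,
    strain=[0.000, 0.002, 0.010, 0.050],
    stress_MPa=[0.0, 140.0, 180.0, 220.0],
)
print(json.dumps(asdict(record), indent=2))  # plain-text, software-neutral storage
```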

It is important that any materials property database be verified and validated through a peer review process. The database also needs to be routinely updated to include additional data for new materials or processes, and to replace the existing ones with new and more accurate data. Obviously, any modifications to the database will need to go through peer review as well.

Although approximations of this ideal exist in some proprietary industry databases, there is no such data repository for all the nonproprietary data generated in industry, universities, and government laboratories. The creation of such a database could trim redundant experimental efforts to decrease cost and increase productivity.35 This is a particularly compelling argument for defense acquisitions.36 Because the DoD contracts its design and manufacturing work, it often pays for data acquisition to support the design process. However, there are limitations to transferring those data among DoD contractors, so the DoD generally pays each contractor to generate its own set of design data.

In addition to eliminating redundancy, a materials property database provides a mechanism to collect multiple data sets for statistical analysis. Over time, the accumulation of data on common materials can improve data quality and validity; in essence, the database provides a mechanism to vet new data.

Finally, it should be noted that databases themselves can function as material models when combined with expert system software. For example, expert systems for casting37 link with a database of casting data generated by experiment and modeling to optimize casting parameters for specialty alloys. Such data-driven expert systems are used in a wide variety of industrial process control models. Box 3-4 illustrates how such models have been further developed and used for aluminum castings in the automotive industry.

Roles for Computational Tools

The opportunities for using computational materials tools in bridging the gap between design and manufacturing fall broadly into three categories: the design of new materials, the selection of existing materials, and the processing and manufacturing of all materials. In each of these categories, the committee makes a recommendation addressing the most critical research needs.

Materials Design

From jet turbine blades to stealth airfoils, new materials enable new products. However, the insertion time for a new material is measured in years—up to a decade for high-reliability applications—while the product realization cycle is measured in months. The lengthy material insertion time is governed primarily by the time necessary to perform the experimental tests required for certification.

35 National Research Council, Materials Technologies for the Process Industries of the Future. National Academy Press, Washington, D.C., 2000, p. 22.

36 National Research Council, Materials Research to Meet 21st-Century Defense Needs. The National Academies Press, Washington, D.C., 2003, pp. 21-25.

37 Walsh Automation, Princeton, New Jersey. Available at: http://www.walshautomation.com/anglais/metals/1_3_4_1.htm. Accessed April 2004.

If computational simulations replaced some tests, the material insertion time could be greatly diminished. This is, in fact, the theme of several active, government-sponsored research programs, most notably the DARPA Accelerated Insertion of Materials (AIM) program.38 The potential cost and time-saving benefits of utilizing physics-based models in materials design are well documented.39

The goal of an accelerated insertion program is to minimize (but certainly not eliminate) experimental trials. The materials models most suited to design and analysis of new materials are those that contain fundamental physics, although they often lack applicability to real materials and products. Empirical models are more reliable at the product level, but cannot be extended to other material classes or to predict behavior beyond certain bounds. To overcome these shortcomings, a combination of models is called for. For example, electronic structure calculations, supported by larger atomistic simulations, can provide phase diagrams and some property information to steer designers toward promising alloys. Mesoscale models can suggest target microstructures and processing routes. Armed with data at smaller length scales, materials scientists can start the conventional continuum design and analysis process.

Finally, the committee notes that due to the cost of modifying established infrastructures, there is not a widespread call to replace commodity materials despite potential economic and competitive advantages. Thus, the insertion of new materials is important only in a small, though critical, subset of products and should not be the sole focus of physics-based modeling efforts. The vast majority of products, including defense acquisitions, are successfully produced from conventional materials. In these commodity materials, physics-based models at the atomic and microstructural scales can help designers better understand and optimize material properties for improving yield, cost, performance, and reliability.

Materials Selection

Manufacturing of a new component inevitably begins with the selection of a material. Such a selection requires in-depth knowledge of the behavior and performance of the material at the component level as well as after the material's integration within the system. Equally important in the selection process are the processing technique and the manufacturability of the components.

For selection of the proper materials, designers rely heavily on databases cataloging the properties and performance of various materials. Examples include publicly available databases for structural and mechanical properties, thermophysical parameters, and phase diagrams, such as Alloy Finder Electronic DataBook, AMPTIAC, CAMPUS Web View, CES Selector 4.0, CINDAS, Key to Metals, Key to Steel, and MatWeb. (Appendix D provides the Web sites for these databases.) There are also proprietary databases, held mostly by materials manufacturers and suppliers. Most of the publicly available databases contain outdated information and are of limited use to materials designers, particularly for the design of functional and advanced structural materials. For example, among the most recently updated databases, the ASM International binary phase diagrams were compiled after a critical peer review process about 20 years ago. Even for this case, the diagrams for most of the commercial multicomponent alloys are entirely missing. Other structural, compositional, mechanical, and processing data for materials selection, given mostly in handbooks, also either are outdated or have been compiled without a critical review process.

38 Defense Advanced Research Projects Agency, Arlington, Virginia. Available at: http://www.darpa.mil/dso/thrust/matdev/aim.htm. Accessed April 2004.

39 National Center for Manufacturing Sciences, Ann Arbor, Michigan, available at: http://springback.ncms.org/, accessed April 2004; Defense Advanced Research Projects Agency, Arlington, Virginia, available at: http://www.darpa.mil/dso/thrust/matdev/aim.htm, accessed April 2004; National Research Council, Modeling and Simulation in Manufacturing and Defense Systems Acquisition: Pathways to Success, National Academy Press, Washington, D.C., 2002.

Structural materials, from plastics to ceramics and metals, constitute the backbone of all manufacturing processes. Detailed databases are required for predicting the behavior of these materials, and consequently their performance, during manufacturing. Such information will enable materials designers to improve processing and manufacturing methods. Such a tailored approach would enable more affordable products as well as provide unique combinations of structural and functional capabilities. Detailed databases would also enable organizations to use modeling to better effect for manufacturing processes.

Processing and Manufacturing

All manufacturing processes begin with raw materials that are converted into semi-finished products, and then various forming and finishing operations turn those products into manufactured components. Materials science informs design and manufacturing engineers how the materials they select will perform during processing into products and, later, over the product's lifetime.

Component design begins with a set of requirements and constraints, such as the required load-bearing capability, weight limits, and space constraints. After selecting materials for the product, the designer uses a CAD system to produce a geometric model of the part, which is then analyzed using material properties obtained from handbooks and databases of the kinds discussed in the preceding sections. The analysis typically entails both product and process simulations at the continuum scale. In current practice, such analyses are often done separately and performed by different analysts, and there can be duplication of effort. Most such analyses require a computed model for use in the CAD system, usually represented by a registered mesh, or grid, that covers the surface of the modeled part. While relatively simple objects can be meshed automatically, for complex objects meshing is a time-consuming and expensive process.

Box 3-4
Virtual Aluminum Castings: Atoms to Engines

Ford Motor Company developed software to use physics-based materials models to link manufacturing, design, and materials. Automotive engines are continually being developed and refined to meet the rapidly changing needs of customers and to deal with competitive and regulatory pressures. Two of the most important components in any engine are the cylinder head and the block. The development of these large and complex aluminum castings is often the rate-limiting step in any engine development program.

Baseline

These components are generally designed using empirical databases, which assume that material properties are not affected by the details of the manufacturing process and are constant throughout the component. The reality is that the key properties are highly dependent on location within the casting. Currently, analytical techniques for design (e.g., durability analysis) and manufacturing are completely unconnected and are conducted by analysts in different organizations. Castability constraints are input early in the design process by manufacturing engineers using engineering "rules of thumb" that are imperfect and are imperfectly applied. Analytical software for assessing castability is typically run late in the component development process and often in response to problems that are encountered in manufacturing.


This situation results in costly iterations in dies and part geometry, late changes, and program delays. Engine failures that occur during engine testing due to imperfect analysis can lead to major and costly setbacks in program timing. Opportunities for optimizing a component, such as by reducing its weight or optimizing casting or heat treatment cycle time, are missed.

Accomplishments

Ford Research Laboratories has developed a comprehensive and integrated suite of computer-aided engineering tools, called Virtual Aluminum Castings (see figure), for use by company CAE analysts. There are three key aspects of this development:

  • The computational materials models for predicting mechanical properties were physics-based and involved linking materials models that account for metallurgical phenomena occurring at vastly different length and time scales.

  • The tools were either newly developed or substantially augmented, and they act as links between the commercial software used for casting analysis and durability analysis.

  • To use these tools in an effective manner, an organizational culture shift was required.

The physics-based computational materials models had to be able to accurately extrapolate and interpolate existing empirical understandings of mechanical properties while also being simple to use and computationally efficient. To accomplish this goal, a wide range of materials modeling tools was used, and the results were linked and embedded in easy-to-use subroutines. Modeling approaches that were used included ab initio calculation of interatomic potentials and free energies, thermodynamic phase equilibria calculations for phase stability and segregation, microstructural evolution models involving diffusion and phase morphology, and micromechanical models for calculation of properties from a variety of mechanisms, such as precipitation hardening, solid solution hardening, and fatigue crack propagation. Development of these models required the establishment of a unique mix of research expertise, including experimentalists, theoreticians and numerical modeling experts, metallurgists, physics researchers, and mechanical engineers.

Ford-proprietary subroutines and linkage programs were developed because these were not available elsewhere and there appeared to be insufficient return on investment for current software vendors to develop the empirical and theoretical knowledge required to accomplish this goal. The development of the Virtual Aluminum Castings suite of tools was enabled by the existence of commercial software, such as Abaqus and ProCast, with open architecture allowing user-defined subroutines. This development was considerably more difficult in other areas due to commercial codes that did not have this feature (e.g., MagmaSoft).
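A toy sketch of the linking idea, emphatically not Ford's actual models: a local result from a casting simulation (solidification time) is converted to a microstructural length scale and then to a local property that a downstream durability analysis could use. The functional forms are standard empirical relations from the casting literature, but every coefficient here is a placeholder.

```python
# Toy multiscale linkage: casting result -> microstructure -> local property.
# Coefficients are placeholders for illustration, not calibrated data.
import math

def sdas_um(solidification_time_s, a=10.0, n=1.0 / 3.0):
    """Secondary dendrite arm spacing ~ a * t_s**n (empirical form)."""
    return a * solidification_time_s ** n

def local_fatigue_strength_MPa(sdas_value_um, s0=130.0, k=400.0):
    """Hypothetical correlation: finer microstructure -> higher strength."""
    return s0 + k / math.sqrt(sdas_value_um)

# Pretend these solidification times came from a casting simulation at
# three locations in a cylinder head.
for node, t_s in {"thin wall": 20.0, "bulkhead": 200.0, "near riser": 2000.0}.items():
    lam = sdas_um(t_s)
    print(f"{node:10s}: t_s = {t_s:6.0f} s, SDAS = {lam:6.1f} um, "
          f"local fatigue strength ~ {local_fatigue_strength_MPa(lam):4.0f} MPa")
```

In the real tool chain the first step is the casting code's field output and the last is a location-dependent property map fed to the durability analysis; the point of the sketch is only the direction of information flow.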

Benefits

These tools have recently been developed and are in the process of being implemented by Ford CAE groups around the world. Using these tools, Ford has experienced an improved ability to launch new cast aluminum products with a minimum of iteration and tooling. This has allowed a decrease in the amount of durability testing required and a decrease in program development time. Ford has embedded significant knowledge of cast aluminum metallurgy into these models and is able to provide it to CAE analysts worldwide on a consistent and common basis. These tools have only recently been implemented within Ford, so their benefits cannot yet be quantified. However, Ford estimates that they have the potential to reduce costs by $45 million to $70 million on an approximately $500 million annual expenditure for new cast aluminum components.

Lessons Learned

  • Linking manufacturing and design via physics-based materials models to allow two-way analytical feedback between manufacturing, materials, and design has significant economic benefits both for industry, in the form of reduced development costs and faster timing, and for society, in the form of more efficient products and reduced waste.


  • A commercial framework and software with open architecture are needed to facilitate the future development of such models.

  • Sufficient fundamental knowledge exists for selected mature metallic systems and selected properties so that with focused effort and judicious selection of experiments, hierarchical, physics-based materials modeling across length and time scales can be accomplished expeditiously and with sufficient accuracy to be useful to industry.

  • It is difficult and rare for any company to amass the expertise required to fully bridge design and manufacturing with materials modeling. This suggests that government coordination of a larger effort, for a wider range of materials and processes, would accelerate the development of such models, minimize the cost by reducing redundancy, and yield substantial benefits to society and U.S. industries.

Ford Research Laboratories' comprehensive and integrated suite of computer-aided engineering tools, called Virtual Aluminum Castings.

Further, the product performance and process analyses may use different geometric models. For example, forming processes such as casting, forging, and sheet metal forming produce components using dies or molds that differ in shape from the final product to accommodate such effects as solidification shrinkage, elastic springback, and trim allowances. Generating both interim and final geometric models is also a significant bottleneck in the overall product design process. In addition, design changes introduced by one analyst to improve the product must be updated and transmitted to the other codes. There is rarely sufficient interoperability between the various analysis codes and the CAD program to transmit design changes effectively.
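As a simple illustration of why tooling geometry differs from part geometry, a linear shrinkage allowance s scales every mold or die dimension; the 1.3 percent figure below is only a commonly quoted order of magnitude for aluminum sand castings, not a design value.

\[
L_{\text{mold}} = L_{\text{part}}\,(1+s), \qquad s \approx 0.013 \;\Rightarrow\; L_{\text{mold}} = 100\ \text{mm} \times 1.013 = 101.3\ \text{mm}.
\]

Springback compensation in sheet forming and trim allowances modify the tooling geometry in analogous, though less simply scalable, ways, which is why the interim and final geometric models genuinely differ.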

Once the computer model of the part is created, the component is analyzed—generally using FEM analysis—to determine its behavior in service and its response during processing. Supports and loads are applied. Material properties are then entered, and product performance is simulated. The material properties take the form of constitutive relations, and as discussed above, these are critical to the fidelity of the simulations.

In addition to incorporating material properties, the simulations apply external loads and interactions in the form of boundary conditions. Often these boundary conditions represent interactions between the component and its environment that are not well understood. Examples include friction and thermal transport, both of which depend on the nature of the contacting surfaces and on local thermal and pressure conditions that may change during the process. Experience and databases are helpful for estimating these properties, but knowledge of values that are time- and condition-dependent will always be inexact. Models can be used to identify the properties to which the results are most sensitive, and sensors can be deployed to advantage in quantifying the actual conditions when physical prototypes are built.
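A minimal sketch of the sensitivity screening mentioned above (the stand-in formula and parameter values are hypothetical; in practice the function would be a full FEM run):

```python
# One-at-a-time finite-difference sensitivity of a simulated output to
# uncertain boundary-condition parameters. run_simulation() is a
# hypothetical stand-in for an FEM forming analysis.
def run_simulation(friction, heat_transfer_W_m2K):
    """Stand-in returning a forming force in kN (illustrative expression)."""
    return 120.0 + 300.0 * friction + 0.005 * heat_transfer_W_m2K

baseline = {"friction": 0.10, "heat_transfer_W_m2K": 2000.0}
f0 = run_simulation(**baseline)

for name, value in baseline.items():
    perturbed = dict(baseline, **{name: value * 1.05})      # +5 percent
    delta = run_simulation(**perturbed) - f0
    print(f"{name:22s}: {delta:+.2f} kN for a 5 percent increase")
```

Parameters that dominate the output are the ones worth instrumenting on the physical prototype.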

Once the simulations are performed, the designer can evaluate the results and modify the design to improve it. While there are some examples of fully automated design optimization, this process is usually done by trial and error. Facilitating this optimization represents a real opportunity to improve the design process. In such an approach, the designer may specify an objective such as minimum weight; constraints, such as part topology and maximum stress; and design variables, such as product dimensions. The simulation process would then automatically make changes to optimize the design. This type of analysis might have the further benefit of identifying sensitivities, i.e., those product features that are most important to the objective. These features might then be candidates for sensing and control during processing or service.
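A minimal sketch of such an automated loop, with the objective, constraint, and design variable named in the text (the closed-form cantilever formulas and all numerical values are illustrative stand-ins for the FEM simulations a real tool would call):

```python
# Illustrative design optimization: minimize weight subject to a maximum
# allowable stress, with a section height as the single design variable.
from scipy.optimize import minimize

RHO = 7850.0          # density, kg/m^3 (illustrative steel)
L, B = 1.0, 0.05      # fixed beam length and width, m
P = 10_000.0          # end load, N
SIGMA_MAX = 250e6     # allowable stress, Pa

def weight(x):
    h = x[0]                          # design variable: section height, m
    return RHO * L * B * h            # objective: beam mass, kg

def stress_margin(x):
    h = x[0]
    sigma = 6.0 * P * L / (B * h**2)  # max bending stress of a cantilever
    return SIGMA_MAX - sigma          # inequality constraint: must be >= 0

result = minimize(weight, x0=[0.10], bounds=[(0.01, 0.30)],
                  constraints=[{"type": "ineq", "fun": stress_margin}])
print(f"optimal height = {result.x[0] * 1000:.1f} mm, mass = {result.fun:.2f} kg")
```

Replacing the two small functions with calls to the product and process simulations is exactly the many-analysis loop discussed next, which is why run time, rather than algorithm availability, is the practical barrier.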

One reason that optimization is not used more widely is that individual simulations take too long and cost too much to perform. Depending on the complexity of the analysis, optimization may require 20 or more separate analyses, and designers may be unwilling to wait for the results. It would seem that increases in computing power would eliminate this problem over time. However, the committee has observed that as computing power increases, the detail and the size of the simulations seem to increase in proportion, so that the clock time for performing simulations really has not changed very much over time.

Research is needed to allow optimization techniques to be used effectively for modeling and simulation of manufacturing processes. One approach is to systematically reduce the model complexity in evaluating initial designs, so that the simulations can be done in much less time. Final designs could then be perfected on the full models. Automated tools for reducing model complexity, for example by removing small features, are lacking. Further, tools for linking results from various software platforms with optimization codes are lacking. Directed research in this area could significantly change the way in which design and manufacturing are done. Some contractors, for example Boeing, have privately funded integrated structural analysis and optimization techniques that have been used successfully on the X-32 JSF prototype, the X-45 UCAV, and the F/A-18E/F Super Hornet.

All simulations must be supported by verification and validation.40 Verification of codes can be done by providing a suite of test problems with known solutions; validation involves comparison of the results of a simulation to the results of experiments to determine the quality of the data that is input to the models, such as material constitutive models, loads, and other boundary conditions.
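As a minimal illustration of code verification against a known solution (not tied to any particular commercial code), the sketch below solves a one-dimensional heat-conduction problem with a finite-difference scheme and confirms that the error against the analytic solution shrinks as the grid is refined.

```python
# Verification sketch: finite-difference solution of -u'' = pi^2 sin(pi x)
# on [0, 1] with u(0) = u(1) = 0, whose exact solution is u = sin(pi x).
import numpy as np

def solve_steady_heat(n):
    """Return interior node locations and the finite-difference solution."""
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1.0 - h, n)
    A = (np.diag(2.0 * np.ones(n)) +
         np.diag(-np.ones(n - 1), 1) +
         np.diag(-np.ones(n - 1), -1))
    b = (np.pi ** 2) * np.sin(np.pi * x) * h ** 2
    return x, np.linalg.solve(A, b)

for n in (10, 20, 40):              # error should drop roughly 4x per halving of h
    x, u = solve_steady_heat(n)
    print(f"n = {n:3d}  max error = {np.max(np.abs(u - np.sin(np.pi * x))):.2e}")
```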

40

"Verification is the process of determining that a model implementation, or simulation, accurately represents the developers' conceptual description and specifications. Validation is the process of determining the degree to which a model and associated data are an accurate representation of the real world, with respect to the model's intended use." From National Research Council, Modeling and Simulation in Manufacturing and Defense Systems Acquisition: Pathways to Success, National Academy Press, Washington, D.C., 2002, p. 95.


MANUFACTURING TOOLS

Manufacturing comprises a wide range of processes, with different emphases in different industries. This section lists those tools that appear generic across industries that make discrete-part products.41 The tools support several kinds of processes falling into two categories: microscale and macroscale. The microscale category includes individual processes and process steps. The macroscale category comprises the system aspects of designing and operating a manufacturing enterprise.

The following is a list of some of the most common tasks and necessary tools:

  • Process planning, including identifying the necessary steps and equipment, and their sequence for part fabrication, for assembly, and for test and inspection

  • Process simulation, analyzing capability, cost, time, and yield, e.g., injection molding, casting, machining, and assembly

  • Logistics planning, including factory layout, e.g., equipment location, storage, and material flows

  • Factory flow simulation, determining location of bottlenecks and total yield

  • Ergonomics analysis for worker safety and effectiveness

  • Robotics and material-handling simulations, timing, tooling and workstation layouts, and cost

  • Production management of materials requirements planning, manufacturing resource planning, and scheduling

  • Economic analysis and justification

  • Quality measures, including statistical process control (SPC), six sigma, and process capability measurement and analysis (a small capability-index sketch follows this list)
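As a minimal illustration of the quality-measures item above, the sketch below computes the process capability indices Cp and Cpk for a sample of measurements; the part dimension, specification limits, and data are invented.

```python
# SPC sketch: process capability indices for a machined dimension.
import numpy as np

def capability(measurements, lsl, usl):
    """Return (Cp, Cpk) for a sample and its lower/upper specification limits."""
    mu, sigma = np.mean(measurements), np.std(measurements, ddof=1)
    cp = (usl - lsl) / (6.0 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3.0 * sigma)
    return cp, cpk

rng = np.random.default_rng(0)
diameters = rng.normal(loc=10.002, scale=0.004, size=50)  # simulated shaft diameters, mm
print(capability(diameters, lsl=9.990, usl=10.010))       # Cpk above 1.33 is a common target
```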

Box 3-5 shows an example of the benefits of using manufacturing software tools. Progress in manufacturing tools has been continuous for decades, driven by advances in basic knowledge about processes and digital representations of this knowledge as well as by improvements in computational power. Commercial industry has taken over a great deal of the effort of converting new knowledge into computer tools. Many of these tools stand alone, but several commercial vendors have linked them together. Interoperability of tools offered by different vendors remains a problem, however, and incentives to make the tools interoperate are few in comparison to the problem of determining which representation to choose.42

41

R. Brown, Delmia Corp., "Digital Manufacturing Tools," presented to the Committee on Bridging Design and Manufacturing, National Research Council, Washington, D.C., April 29, 2003.

42

C. Hoffmann, Purdue University, CAD Tools Evolution and Compatibility, presented to the Committee on Bridging Design and Manufacturing, National Research Council, Woods Hole, Mass., August 25, 2003.


Box 3-5
Case Study in Manufacturing Software Tools

Baseline

Aerospace parts were programmed for milling by conventional manual programming.

Accomplishments

An expert system was developed to partially automate the programming process. CAD solid models were processed by the new software and geometric features were recognized and noted. Then standard processes for producing those features were found in a feature/process table. Finally, each standard process was expressed in terms of the workpiece size, orientation and position, and machine tool moves were calculated.
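The toy sketch below is not the system described in this case study; it merely illustrates the kind of feature-to-process lookup involved. The feature names, table entries, and output format are hypothetical.

```python
# Hypothetical feature/process table lookup that expands recognized CAD
# features into an ordered list of machining operations for later review.
FEATURE_PROCESS_TABLE = {
    "face":         ["face_mill"],
    "pocket":       ["rough_mill", "finish_mill"],
    "through_hole": ["center_drill", "drill"],
}

def plan_operations(features):
    """Map each recognized feature to its standard process steps."""
    operations = []
    for feature in features:
        for step in FEATURE_PROCESS_TABLE[feature["type"]]:
            operations.append({"step": step,
                               "at": feature["location"],
                               "depth": feature.get("depth", 0.0)})
    return operations

recognized = [{"type": "face", "location": (0.0, 0.0, 25.0)},
              {"type": "pocket", "location": (40.0, 30.0, 25.0), "depth": 8.0},
              {"type": "through_hole", "location": (10.0, 10.0, 25.0), "depth": 25.0}]
for op in plan_operations(recognized):
    print(op)   # in practice a programmer still reviews and edits the output
```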

The resulting computer numerical control (CNC) program output was scanned by a programmer and manually edited where necessary.

Reprinted by permission from BobCAD-CAM; available at http://www.bobcadcam.com/.

Note that inclusion as an example is not intended to be an endorsement of any particular product.

Benefits

Benefits included a significant reduction in manual CNC programming time. An average reduction of 83 percent in manual programming time was achieved in a test of five typical aluminum aircraft parts. The total production process time was reduced by more than one week on average.

Lessons Learned

It is possible to automate much of the manual CNC programming process by using expert system software and feature recognition in connection with a solid model of the part to be machined. Programming time data from the five-part test indicated that conventional processing that took between 45 and 120 hours could be reduced to between 4 and 24 hours using HiThru.a

a

P. Zelinski, "Empowering the Programmer," Modern Machine Shop Magazine, April, 2002.


In some domains, fundamental knowledge is still lacking. Research is needed to develop efficient scheduling algorithms that permit rapid redirection of resources to meet changing demands or circumstances; models of assembly tasks performed by people to understand fatigue, errors, and injuries; design of factories to permit flexibility and efficient redeployment of large investments; and generalizations of existing design-for-X (DfX) methods and rule bases to include such concerns as product quality, recycling, and environmental friendliness.

Bridging of Design and Manufacturing

The bridging of design and manufacturing is a core issue for this study. It is not represented by an explicit step in conventional definitions of product development, except perhaps in design-for-X (DfX) processes. Instead, ideally, it is a pervasive activity that should occur continuously throughout product development. The committee notes that each identified activity is carried out now, but the degree to which tools exist in automated form varies greatly. Moreover, the lack of understanding of cultural and managerial barriers may be more important than lack of computer tools.

Technical Coordination of Specifications and Procedures

Some of these coordination activities include the following:

  • Identification of critical resources such as suppliers, factories, long lead items, and employees' skills needed to manufacture a given design

  • Identification of design and materials alternatives or process alternatives needed to manufacture a given design, together with ways of finding the best combination

  • Determination of the structure of product families and architectures to coordinate with layout, equipment, and organization of the factory to permit flexibility and efficient redeployment of assets to meet changing requirements

  • Alignment of materials properties specifications and production outputs, tolerances on parts and resulting variation, and tolerances on assemblies and resulting variation

  • Collection and utilization of lessons learned during product launch

  • Collection and utilization of lessons learned during use of the product

There is a race between advancing knowledge and rising expectations regarding product quality and performance. As customers' expectations rise, tolerances that used to be sufficient are no longer acceptable. Competition drives all players to be as good as the leaders. Better understanding of materials and processing methods will grow incrementally, as it has in the past. Breakthroughs in conventional materials are unlikely, and adoption of new materials in existing industries is notoriously slow. Stand-alone computer tools will emerge when knowledge becomes stable. Adoption of such tools will depend on ease of learning and use and on how well what a tool does matches what the company requires its employees to do. Organizational barriers and incorrectly structured incentives may inhibit the adoption of tools and methods.

Many companies outsource the design and construction of manufacturing equipment and systems, often because manufacturers no longer have the capability. But that capability was lost when the decision was made to outsource in the first place.


Outsourcing of tightly coupled activities adds complexity to an already complex bridging process.43 This trend is expected to continue, although some companies are trying to regain lost capabilities.

Organizational and Managerial Arrangements, and Enablers of Bridging

There are a number of organizational and managerial arrangements that need to be considered:

  • Methods of analyzing design processes to identify inefficiencies such as missing information and unnecessary repetitions of process steps

  • Economic models of cost versus value in revising production processes and investments to improve the product

  • Skills and training in negotiation

  • Definition of roles and responsibilities that bridge conventional organizational boundaries between design and manufacturing

In this domain there is continual flux and reconceptualization of objectives and methods. Lean manufacturing and agile manufacturing are two suggested ways of organizing manufacturing enterprises for the best delivery of value. Within lean manufacturing, one suggested concept is value stream mapping. This technique aims to identify all the steps and actors in creation of an object or a piece of information in order to find inefficiencies and eliminate waste. This is a domain in which enlightened practitioners often lead researchers, who serve to observe, report, and systematize what they observe in order to improve the performance of other companies.

Here, again, there is a race between capabilities and expectations. In particular, companies want to develop products faster. In the car industry, it typically takes about 4 years to go from concept to production. One of the longest steps in car development is evaluation of the manufacturing feasibility of the design, which sometimes takes 2 years. Great improvements in streamlining this process have been achieved by introducing computational tools. At the same time, new requirements for safety, durability, and appearance have made the task harder. Also, much if not most of the value of cars and aircraft is outsourced, requiring coordination of design and manufacturing across many organizational boundaries. This extra set of transactions increases the complexity. The result is that the process is not significantly faster than it was 10 years ago.

However, some aircraft companies using the results of the USAF/MIT Lean Aerospace Initiative with integrated analysis tools have now designed and developed new prototype aircraft such as the X-32, X-45, and X-47 in approximately half the schedule and half the cost of traditional methods. Also, some car companies can bring out new versions of existing cars in as few as 2 years. The reasons appear to be a combination of more astute use of computational tools plus managerial techniques such as coordination of tasks, reuse of existing designs and factories, smart supply chain management, and incentives for design and manufacturing engineers to work more closely together.44 As another example, the new Boeing 7E7 commercial transport is planned to be in final assembly for only 3 days, reflecting the culmination of the lean enterprise transformation of lean engineering, lean supply chain, and lean manufacturing.

43

Charles Fine and Daniel Whitney, "Is the Make-Buy Decision a Core Competence?" Moreno Muffatto and Kulwant Pawar (eds.), Logistics in the Information Age, Servizi Grafici Editoriali, Padova, Italy, 1999, pp. 31-63.

44

Durward K. Sobek II, "Principles That Shape Product Development Systems: A Toyota-Chrysler Comparison," PhD Thesis, University of Michigan, 1997; J.M. Morgan, "High Performance Product Development: A Systems Approach to a Lean Product Development Process," PhD Thesis, University of Michigan, 2002.


Improvement in design–manufacturing coordination involves both technical and managerial/organizational actions. Creation of new technical knowledge in this domain will not be sufficient without accompanying improvements in management methods and organizational arrangements. These include how to structure cross-functional teams, how to flow information in a timely manner between team members, how to identify and resolve conflicts and discrepancies, and so on. These are ongoing research topics in business schools and some engineering schools. These activities should be encouraged. Research on outsourcing of key activities to determine how to minimize complexity and maximize coordination is also needed, along with better economic models of outsourcing choices that reflect the strategic impacts on companies and industries. Loss of national capability also needs to be assessed.

LIFE-CYCLE ASSESSMENT TOOLS

This section deals with the evaluation of the environmental impact of a product over its life cycle from concept to disposal, not with life-cycle design or life-cycle analysis, broader topics that also encompass life-cycle costing, design for reliability, and design for maintainability. Some aspects of life-cycle design are addressed in the earlier sections on systems engineering tools and engineering design tools.

Increasingly stringent environmental regulations are inducing a more holistic approach to environmental problems, shifting the focus from end-of-pipe pollution control to transforming industry to act as ecosystems with closed loops between wastes and resources. This mass balance approach to environmental problems, pioneered by Ayres and Kneese45 and later called industrial ecology (IE) by Frosch and Gallopoulos,46 is attracting considerable interest within the engineering and scientific community. With population pressures, congestion, resource depletion, and other indicators suggesting limitations on the assimilative capacity of the biosphere, source reduction, recycling, and other strategies to reduce waste generation (including emissions) and resource consumption are gaining greater attention.

To devise and implement these strategies, firms must view their environmental impacts in a broad context from input supply and production through product distribution, use, and disposal. For instance, recent efforts by automobile companies to redesign internal combustion engines and to develop hydrogen fuel cell-powered vehicles are motivated in part by increasingly stringent air emission standards both here and abroad. The net environmental benefits of these technologies over existing transportation systems should be measured broadly to include environmental impacts during energy resource extraction, fuels processing, vehicle utilization, and product disposal and recovery.

45

R.U. Ayres and A.V. Kneese, "Production, Consumption, and Externalities," American Economic Review, Vol. 59, No. 3, pp. 282-297, 1969.

46

R.A. Frosch and N.E. Gallopoulos, "Strategies for Manufacturing," Scientific American, September, pp. 144-152, 1989.


There are several emerging methodologies for tracking environmental impacts within industrial systems. One approach is to quantify mass flows from source to sink for a material, element, chemical compound, or finished product at a point in time for a specific region; this is known as substance or mass flow analysis (MFA). For example, Socolow and Thomas47 examined flows of lead in the U.S. economy, arguing that large-scale use of lead in electric cars should not be precluded because the nearly closed recycling system that already exists for lead–acid batteries implies minimal health risks from lead exposure.

Another common IE tool is life-cycle assessment (LCA), which the Society of Environmental Toxicology and Chemistry (SETAC) recommends to address the environmental implications of products and processes.48 SETAC views LCA as an objective process to evaluate the environmental burdens associated with a product, process, or activity.

Mass balance analysis is employed in the inventory analysis component of an LCA, involving inventories of energy, materials, and wastes in raw material preparation, manufacturing, use, and disposal. For example, an LCA of an automobile involves estimating the resource use and effluents generated in the production of steel, glass, rubber, and other material components of the car. To this are added the resource and emissions inventory of the automobile assembly plant. The focus then shifts to the use phase: the resources consumed and emissions generated in both fuel production and vehicle operation, plus the burdens (consumption and emissions) associated with maintenance and repair, such as new parts and oil changes. The final phase of the inventory examines the disposal of the vehicle, estimating what proportion is recycled and the composition of that flow.
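A minimal sketch of this stage-by-stage accounting is shown below; the stage names and quantities are invented placeholders, not data from any actual study.

```python
# Inventory sketch: sum resource use and emissions across life-cycle stages
# into a single cradle-to-grave tally per vehicle (illustrative numbers).
from collections import Counter

stage_inventories = {
    "materials_production":    {"energy_MJ": 60_000, "CO2_kg": 4_500, "SO2_kg": 12.0},
    "assembly":                {"energy_MJ": 15_000, "CO2_kg": 1_100, "SO2_kg": 2.5},
    "fuel_production_and_use": {"energy_MJ": 480_000, "CO2_kg": 35_000, "NOx_kg": 45.0},
    "maintenance_and_repair":  {"energy_MJ": 8_000, "CO2_kg": 600},
    "end_of_life":             {"energy_MJ": 2_000, "CO2_kg": 150, "steel_recycled_kg": -900},
}

totals = Counter()
for flows in stage_inventories.values():
    totals.update(flows)          # element-wise sum of every flow across stages
print(dict(totals))               # cradle-to-grave inventory for one vehicle
```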

The inventory provides a more or less quantitative overview of the material and energy flows incurred in the product life cycle. Impact assessment, which follows, attempts to quantify potential impacts of the inventory on environmental and human health through metrics in a series of categories. Aggregating these impacts into metrics for decision making is perhaps one of the greatest challenges facing LCA.49 From this point, decision makers, such as product design and development teams, identify strategies to improve environmental performance.

Development and implementation of these strategies involve a set of activities known in the industrial ecology community as design for the environment (DfE), for which Allenby50 identifies two general categories. The first includes generic efforts, such as green accounting systems and environmentally sensitive procurement policies. The second includes technological development, such as computerized DfE design tools integrated with automated design and manufacturing software.

Life-Cycle Assessment

The Society of Environmental Toxicology and Chemistry (SETAC)51 provides a formal definition of LCA.

47

R. Socolow and V. Thomas, "The Industrial Ecology of Lead and Electric Vehicles," Journal of Industrial Ecology, Vol. 1, No. 1, pp. 13-36, 1997.

48

Society of Environmental Toxicology and Chemistry, A Technical Framework for Life-Cycle Assessment, SETAC, Washington, D.C., 1991.

49

Committee on Material Flows Accounting of Natural Resources, Products, and Residuals, National Research Council, Materials Count: The Case for Material Flows Analysis. The National Academies Press, Washington, D.C., 2002.

50

B.R. Allenby, Industrial Ecology: Policy Framework and Implementation, Prentice-Hall, Upper Saddle River, N.J., 1999, pp. 69-95.

51

Society of Environmental Toxicology and Chemistry, A Technical Framework for Life-Cycle Assessment, SETAC, Washington, D.C., 1991, p. 1.


The SETAC definition suggests four steps in LCA: goal and scope definition, resource and emissions inventory, impact assessment, and improvement analysis. The ideal is an objective assessment of the environmental implications of a well-defined production process and the identification of opportunities to improve environmental performance. Owens,52 however, argues that it is impossible to be entirely objective because most LCAs involve simplifying assumptions and subjective judgments.

As noted above, LCA attempts to provide an objective assessment of the environmental implications of a well-defined production process and identifies opportunities to improve environmental performance. LCA studies should clearly define their goals. These goals may serve to improve environmental management and product design within the firm. Other goals include strategic concerns, such as demonstrating that a product or process has environmental attributes that exceed the competition. The definition of "cradle" and "grave" often varies depending upon the goal and scope of the analysis. In many cases, the cradle includes mining and raw material extraction while the grave is at the plant gate.

One of the key steps in LCA is the emissions inventory. For many industrial processes, detailed inventory data by process are unavailable. In this case, some studies infer the data based upon input–output or mass balance relationships. A considerable number of subjective judgments enter this stage of the assessment, often buried in the details of the data compilation. In some cases, emissions judged to have no impact are not included in the inventory. This practice is misleading, although a mass balance could effectively uncover these emissions, depending of course on their magnitude. Conducting a mass balance requires engineering expertise and a detailed knowledge of the production process.

Once the inventory is complete, the next step of LCA is impact assessment. Many industrial processes generate an array of air, water, and solid-waste emissions. Each of these categories in turn generates an array of impacts. Many studies classify these impacts into categories such as human health, visibility, acid deposition, and global climate change. Since one pollutant can contribute to more than one impact category, LCA studies often develop metrics to measure the effect that each emission has on each impact category.
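The sketch below illustrates this characterization step: each emission is multiplied by a factor for every impact category it affects and the products are summed per category. The factors and category names are placeholders, not authoritative values.

```python
# Impact-assessment sketch: map an emissions inventory onto impact categories
# using (placeholder) characterization factors.
CHARACTERIZATION = {                # emission -> {impact category: factor}
    "CO2_kg": {"global_warming_kgCO2eq": 1.0},
    "CH4_kg": {"global_warming_kgCO2eq": 28.0},
    "SO2_kg": {"acidification_kgSO2eq": 1.0, "human_health_pts": 0.03},
    "NOx_kg": {"acidification_kgSO2eq": 0.7, "smog_pts": 1.2},
}

def impact_scores(inventory):
    scores = {}
    for emission, amount in inventory.items():
        for category, factor in CHARACTERIZATION.get(emission, {}).items():
            scores[category] = scores.get(category, 0.0) + amount * factor
    return scores

print(impact_scores({"CO2_kg": 41_350, "SO2_kg": 14.5, "NOx_kg": 45.0}))
```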

As envisioned by SETAC, the final step in LCA is an analysis of how to improve the environmental performance of the production process. The analyst has the LCA inventory and estimates of the impacts across several different impact categories. LCA by itself, however, does not provide a framework for improvement analysis. The necessary decision framework should have two features. First, it should consider the cost of existing and alternative production technologies. In most situations, firms will not adopt environmentally beneficial technologies unless they generate significant cost savings over current technology.53

52

J.W. Owens, "Life-Cycle Assessment: Constraints on Moving from Inventory to Impact Assessment," Journal of Industrial Ecology, Vol. 1, No. 1, pp. 37-50, 1997.

53

It should be noted that the aerospace industry in Southern California, where the regulations of the South Coast Air Quality Management District are perhaps the most stringent in the world, has been very innovative in applying advanced manufacturing technology to develop such technologies as alternative paints, corrosion protection coatings, and adhesives to meet these very stringent requirements and at the same time save weight and reduce unit aircraft cost. For example, Boeing is now using new topcoat paint for the C-17 that has a lower volatile organic compound discharge rate and is lighter, more durable, and less expensive than traditional paints.


The second feature of the decision framework is that it should include some method to aggregate environmental impacts, which impose indirect costs on the firm (perhaps by inducing onerous regulation) as well as costs on society. A broader definition of cost would include typical operating and capital costs, costs associated with environmental damage, and savings from avoided disposal and regulation. The LCA inventory can provide a good basis for estimating these environmental costs. This last feature requires the development of an environmental metric.

Considerable international effort has been expended to standardize the LCA process. Details of the standards can be found in ISO 14040, 14041, 14042, and 14043. These standards view LCA somewhat differently than SETAC does: the phases of an LCA are goal and scope definition, inventory analysis, impact assessment, and interpretation. The ISO process sees LCA as having generally more iteration between the phases and also recommends certain practices pertaining to LCA objectives, public versus private studies, product comparisons, and allocation procedures. However, the basic process and intent of the two methods are the same, and use of the SETAC framework here captures, with a little less complication, the essentials of LCA.

The comprehensive nature of LCA is perhaps one of its flaws because the informational requirements can be daunting and expensive to meet. Estimating how these emissions affect human health, global warming, and other environmental problems is even more complex and is fraught with considerable uncertainty. Even with quantification of these uncertainties, impact assessment does not provide policy makers with a clear ranking of alternatives. This task requires an environmental metric that weights various environmental impacts.

Several options exist for developing such metrics. One approach used by environmental economists essentially places a dollar value on impacts. This approach quantifies damage from environmental impacts so that the benefits of pollution reduction are the minimization or elimination of related monetary damages. Like the estimation of impacts, estimating the value of damage introduces another layer of uncertainty. While estimates of health costs associated with environmentally induced illness are easily documented, other impacts, such as ecosystem preservation and biodiversity, are inherently much more difficult to value. Another class of metrics avoids valuation and instead uses a variety of sustainability indices. The main drawback of these approaches is that arbitrary judgments may creep into the quantification of the sustainable standard.
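A minimal sketch of the monetization approach follows; the dollar-per-unit damage values are invented by way of illustration and would in practice be contested estimates.

```python
# Monetization sketch: collapse impact-category scores into one
# dollar-denominated damage figure using unit damage values.
DAMAGE_PER_UNIT = {                  # dollars per unit of each impact score
    "global_warming_kgCO2eq": 0.05,
    "acidification_kgSO2eq": 2.00,
    "smog_pts": 1.50,
    "human_health_pts": 40.00,
}

def environmental_damage(scores):
    return sum(DAMAGE_PER_UNIT.get(category, 0.0) * value
               for category, value in scores.items())

scores = {"global_warming_kgCO2eq": 41_756.0, "acidification_kgSO2eq": 46.0, "smog_pts": 54.0}
print(f"estimated damage: ${environmental_damage(scores):,.2f}")
```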

Regardless of what environmental metric or set of metrics is used, the technology adoption problem remains one of optimal choice under uncertainty. Many researchers have used operations research tools, such as linear programming models, to address these problems. For example, Considine et al. use a linear programming model to identify least-cost production and investment strategies for the steel industry under coke oven emission controls.54 Their model integrates engineering, economic, and environmental information. Other researchers have adopted a similar approach, such as Allen55 in his study of chlorine minimization strategies in the chemical industry.

Design for the Environment

A firm evaluating new technology must balance cost and strategic concerns with environmental performance. Cost depends upon the unit labor, energy, and material efficiency of the process as well as capital intensity. Cost is important because it ultimately relates to product affordability.

54

T.J. Considine, G.A. Davis, and D.M. Marakovits, "Technological Change Under Residual Risk Regulation," Environmental and Resource Economics, Vol. 3, pp. 15-33, 1993.

55

D. Chang and D.T. Allen, "Minimizing Chlorine Use: Assessing the Trade-offs Between Cost and Chlorine Reduction in Chemical Manufacturing," Journal of Industrial Ecology, Vol. 1, No. 2, pp. 111-134, 1997.


Strategic aspects include breaking dependence upon suppliers of essential intermediate materials, expansion into growth markets, and many other considerations. Increasingly, environmental performance is entering technology development and investment decisions.

Evaluating the cost-effectiveness and environmental implications of different process and strategic choices facing a manufacturing plant can be complex. On the environmental side, the main source of complexity is the sheer volume of data from an LCA inventory. Another complication involves the interrelated nature of manufacturing operations. Changes in one unit process can affect the economics and environmental performance of downstream processes. Operations research (OR) tools provide a framework for organizing this information, identifying trade-offs, and making decisions.

One subset of OR tools is engineering–economic process models. As the name suggests, these models integrate engineering detail about the production process with cost information. Environmental data concerning emissions are essentially engineering information. Building an engineering–economic process model typically begins with a definition of the production process at some chosen level of detail dictated by the availability of data about the process. For instance, at a very high level of aggregation, fiber optic production could be modeled to include slurry preparation, glass ingot production, cable extrusion, and product finishing. The level of aggregation depends upon the purpose of the model. If the intent is to understand the economic and environmental trade-offs of a new process, then all that is needed is to determine how that process would alter existing practice. The model of the existing system and its possible reconfigurations would include the new technology as a process option.

The next step in developing the model involves quantifying the input–output (IO) relations at each stage of the production process. Inputs include fuels, materials, supplies, maintenance, water, labor, and other inputs that vary with unit production levels. Outputs include the final product; recoverable byproducts such as heat, steam, or offgases; and air, water, and solid-waste emissions (nonrecoverable byproducts). LCA emissions inventories taken for regulatory purposes are natural databases for estimating these latter components.

In addition to the IO coefficients, constraints are another characteristic of the production process. One common constraint is capacity. For instance, how many tons of iron can a blast furnace produce in 1 year at average capacity utilization? Other constraints include final product demand, environmental emission standards, and material balances. The material balance constraints ensure that supplies of intermediate inputs are at least as large as the demand for them by downstream unit operations.

The IO relations and constraints are essentially engineering information about the production process. The next layer requires economic information, including prices paid for purchased fuels, materials, and supplies. In addition, the analysis requires hourly wage rates for production workers and salaries for managerial and technical staff. Total operating costs equal the product of input requirements and prices paid for these inputs, summed across all operations.

The final step of the analysis requires specifying an objective function. There are several approaches available. First, one could specify a multiobjective function that essentially is a weighted function of operating and capital costs and environmental impacts with weights selected by the decision maker. Choosing the weights, however, often involves subjective judgments. Another approach is to specify an environmental damage function, which is the product of the environmental impacts and the dollar-per-kilogram damages. Under this specification, the objective is to minimize the sum of operating, capital, and environmental damage costs. These components—the process activities, their IO coefficients, the production constraints, and the objective function—constitute the engineering–economic process model.
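The sketch below casts a deliberately tiny version of such a model, with two alternative unit processes meeting a single product demand, as a linear program. All coefficients are invented for illustration and are not drawn from the studies cited in this chapter.

```python
# Engineering-economic process model as a linear program (toy example):
# choose activity levels for two alternative processes to meet demand at
# minimum operating-plus-environmental-damage cost.
from scipy.optimize import linprog

op_cost = [300.0, 260.0]        # $ per ton of product, operating + capital charge
emissions = [0.8, 1.4]          # tons CO2-eq per ton of product
damage = 25.0                   # $ per ton CO2-eq, the environmental damage value
cost = [op_cost[i] + damage * emissions[i] for i in range(2)]   # objective coefficients

demand = 1000.0                 # tons of product required
capacity = [(0.0, 700.0), (0.0, 900.0)]                         # per-process capacity limits

result = linprog(c=cost,
                 A_eq=[[1.0, 1.0]], b_eq=[demand],              # balance row: meet demand
                 bounds=capacity)
print(result.x, result.fun)     # optimal tonnage by process and total social cost
```

Real models add many processes, intermediate-product balance rows, and emission or capacity constraints, but the structure (cost coefficients, balances, and an objective) is the same.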

Life-cycle assessment information enters the definition of the process activities and the estimation of the IO coefficients. These coefficients include the indirect environmental impacts from upstream production activities. For example, purchasing a kilowatt hour of electricity may cost four cents and indirectly contribute to global climate change, ground-level ozone impacts, and acid rain problems. In other words, the electricity-purchasing activity costs money and generates environmental impacts that impose costs on society. Subsequent processing activities involve purchasing factor inputs at market prices, using intermediate products that transfer from one process to another, and consuming common property environmental resources with values approximated by the procedures discussed above.

The solution of an engineering–economic process model finds the mix of production activities that satisfies the objective function. Typically one of the first steps is to solve the model using the firm's current objective; for example, by minimizing operating costs and capital charges, and then comparing the optimal solution with current production levels for each unit process. If the constraints, prices, and IO coefficients are correct, then the solution should closely correspond with current practice. In essence, the process model provides a quantitative description of the production process. Some industries use process models for operation management, such as petroleum refiners who optimize their product mix based upon the cost and quality of their hydrocarbon inputs.

Process models for DfE also provide a framework for examining the economics and environmental impacts of process design options. A modification of an existing process or a new process would become a process option in the model. Given market, capacity, and material balance constraints, the model would determine the least-cost mix of processes across different technologies. The challenge for this stage is to develop reliable estimates of the IO coefficients for the new technologies. Box 3-6 illustrates the use of such engineering–economic process models for steel production.56

In some cases, the design process is too complex and detailed to perform an overall system optimization. In these cases, process models can be used to optimize system components, and less formal methods can be used to arrive at a final design that balances environmental, performance, and cost considerations. Product designers would then apply their own subjective weighting for these criteria, iterating toward a final design.

Tool Development Needs

Despite a mature state of development, life-cycle assessment is a time-consuming and costly process. Moreover, the reliability of the results is unknown because the data and the methodology underlying environmental performance metrics are proprietary and, therefore, not subject to rigorous peer review. Although the International Organization for Standardization (ISO) has issued guidelines for conducting life-cycle assessment studies, two areas in need of further development are standardized peer-reviewed databases and metrics development. The former is being addressed in North America by the National LCI Database project managed by Athena International. Some insights into the issues pertaining to metrics development can be found in a 1999 National Research Council report.57

The development of transparent and reproducible environmental performance metrics is clearly a necessary first step in bridging the gap between design for the environment and manufacturing. Engineers need to know the environmental design criteria. Developing one metric that aggregates many different environmental performance indicators is one approach. Another approach is to consider multiple criteria and subjectively balance them. In practice, however, designers need some notion of what the minimum acceptable environmental standards should be. Government standards, for example, attempt to do this by setting standards for emissions of air pollutants, such as sulfur dioxide, nitrogen oxides, and particulate matter. Standards, however, are often considered an inefficient way to improve the environment because they stifle technological innovation.

56

T.J. Considine, C. Jablonowski, and D. Considine, "The Environment and New Technology Adoption in the U.S. Steel Industry," final report to National Science Foundation and Lucent Technologies, BES-9727296, May, 2001.

57

National Research Council, Industrial Environmental Performance Metrics: Challenges and Opportunities, Committee on Industrial Environmental Performance Metrics, National Academy Press, Washington, D.C., 1999.


Box 3-6
Case Study of the Environment and Process Design in Steel Production

Steel mills use one of two types of furnaces to make new steel. Both furnaces recycle old steel into new, but each is used to create different products for varied applications. The first, the basic oxygen furnace, uses about 28 percent steel scrap to make new steel. The other 72 percent is molten iron produced from blast furnaces, which requires iron ore from mines, limestone from quarries, and coke from batteries of ovens. The furnace produces uniform and high-quality flat-rolled steel products used in cans, appliances, and automobiles. The other type of steel-making furnace, the electric arc furnace, melts virtually 100 percent steel scrap to make new steel. Steel minimills using these furnaces now produce nearly 50 percent of total U.S. steel production. This steel is used primarily to make products that have long shapes, such as steel plates, rebars, and structural beams. Steel minimills are far less capital intensive than integrated mills because they do not require blast furnaces and coke ovens. Their reliance on steel scrap also affords them an environmental advantage: lower consumption of energy and virgin materials.

Minimills have entered the last domain of integrated steel, employing thin-slab casting that can yield relatively high-quality sheet steel. This additional competitive force comes at a time when many integrated steel firms are seriously reevaluating their plants in light of the recent regulations controlling toxic emissions from coke ovens. Most existing methods of producing coke generate fugitive emissions that contain potentially carcinogenic substances, such as benzene soluble organics (BSOs). A variety of strategies, some entailing additional investment and/or higher operating costs, can reduce these emissions. Inland Steel built a large battery of coke ovens using the Thompson nonrecovery process, heralded as a possible clean technology breakthrough. This design allows the controlled burning of coal that destroys the BSOs and other potentially carcinogenic compounds contained in the offgases of the coking process. There are, however, relatively large amounts of sulfur dioxide emissions; the waste heat can be recovered via heat exchangers and used to produce steam for electricity generation.

Several other iron- and steel-making technologies could either reduce or eliminate coke consumption. Pulverized coal injection, replacing up to 40 percent of the coke needed in iron making, is widely used in Europe, Asia, and Japan and is now gaining favor in the United States. Natural gas injection is another alternative technology.

There are also two new steel-making technologies that could totally eliminate the need for coke. First, direct reduction, a coal- or natural gas-based iron-making process, produces an iron substitute for scrap in electric arc furnaces. Another coke-eliminating option is the Corex process, which does not require coke and produces a large volume of waste heat that can be used to cogenerate electricity. Jewell is a nonpolluting coking technology used in steel making. Another coking process is Calderon, in which coal feeding and product recovery are carried out in a closed process.

To evaluate the economic and environmental performance of these technologies, an engineering–economic model of steel production is used. The model incorporates environmental emissions coefficients from an LCA of steel production from primary resource extraction to the plant gate. The model selects the optimal combination of activities to minimize cost subject to a number of constraints, including mass and energy balances for intermediate products. Substitute activities represent new technologies available for possible adoption. The model is for a specific steel plant with coefficients based upon actual operating performance.

This analysis provides insights into the trade-offs between cost and environmental objectives, such as reducing greenhouse gas emissions, toxic discharges, and acidic residuals. The second application solves the model under two different definitions of cost: private cost and social cost, where social cost includes private costs plus the environmental damage associated with LCA impacts. This approach permits determination of the socially optimal steel production technology mix achieved by internalizing environmental externalities. Following a sensitivity analysis, the third and final application examines the impact of carbon and virgin material taxes on technology choice in the steel industry.

The incremental private and social costs of steel design options are shown below. On the basis of the total quantity of emissions in mass units, scrap-based steel production is environmentally superior to conventional integrated steel production.


Using an economic valuation of the life-cycle environmental impacts, however, indicates that these two technology paths are quite similar. In fact, using conventional damage cost estimates, electric arc furnace steel production imposes slightly greater environmental damage than does integrated production due to substantially greater emissions of SOx and NOx resulting from the electricity generated to supply these facilities. Hence, adopting a life-cycle perspective for technology assessment can yield some rather surprising conclusions. If producers explicitly minimize social cost, however, scrap-based steel production with natural gas cogeneration of electricity is optimal. This finding suggests that electricity supply decisions are a critical element in assessing the economic and environmental performance of new steel production technologies.

Some of the Incremental Private and Social Costs of Steel Design Options

                           Jewell    Corex    Scrap      Scrap-DR   Calderon
                                              Electric   Electric
New capital expenditures   216.6a    575.42   437.41     503.17     246.32
Labor and capital           32.40     90.78   –89.66     –72.73       0.48
Energy                      –4.11     30.69   –20.07     –15.10       3.02
Materials                    0.00    –35.89    85.57      63.34       0.82
Total operating cost        28.29     85.59   –24.19     –24.49       4.32
(less byproduct sales)      –0.16     –3.07     5.58       5.19       0.00
Net operating cost          28.13     82.51   –18.57     –19.30       4.32
Environmental damage         2.34    132.84    57.39      63.34     –16.58
Total social cost           30.63    218.43    33.23      38.85     –12.26

a. Millions of 1998 dollars.


There is also a need for integrating life-cycle assessment tools with operations-research-based decision science models so that cost, technical, and environmental performance can be optimized. For this to occur, however, environmental performance metrics must be specified and measured. Decision makers need a measure of the environmental bottom line, not an array of different environmental impacts that are difficult to value individually, much less collectively. Balancing various technical performance measures in design is a similar problem. In this case, establishing minimum acceptable standards helps simplify the decision problem by turning the choice into a constraint or requirement. Unfortunately, the societal consensus reflected in environmental standards is often at odds with companies' fiduciary responsibility to stockholders for profitability.

Another need for life-cycle assessment is to simplify and standardize its application. One possibility is a modular approach that reduces LCA cost by giving industry off-the-shelf modules that provide LCA impacts for a given material or transformation process.


COMMON THEMES

Different disciplinary areas are directly involved in the design and manufacturing process—systems engineering, engineering design, materials science, manufacturing, and life-cycle assessment. Other supporting infrastructures are involved indirectly and affect all of these specific fields in an overarching way.

As outsourcing becomes more prevalent, and as many of these tasks are sent overseas, maintaining design and manufacturing capability in the United States is a real concern. It is essential that the United States continue to produce students who are trained for design, manufacturing, and systems engineering. The nation must also maintain a manufacturing base that employs these graduates.

Engineering Education

The availability of an educated domestic workforce is crucial to the quality of life, to the national defense, and to the economic security and competitiveness of the nation, and a key part of this workforce is in the manufacturing sector. The education and training of tomorrow's workforce become even more critical when one considers that the entire design and manufacturing field has expanded greatly in knowledge in recent years and will continue to do so, most likely at an even faster pace, in the foreseeable future.

Information technology is rapidly enhancing the process of communication between customers, engineers, and manufacturers. The broadening of the arena requires an integrated and well-balanced science and engineering education that covers systems, design, materials, and manufacturing. An integrated approach for traditional educational institutions as well as for certification programs for practitioners will ensure that the workforce is able to use the new tools and strategies for efficient product realization.
