- By Subham Sett, Jing Bi
- Factory Automation
- Most digital tools address some of the critical additive production challenges but rely on other tools to complete the process—causing productivity losses.
- An organization needs a model-based approach that integrates design, materials, manufacturing, and production to successfully evolve additive from the lab to a production environment.
- An unbiased public benchmark is crucial to building trust in the additive community.
Any machine, any process, any material
Additive manufacturing (AM) has witnessed tremendous growth as its focus has shifted from prototypes to end-use functional parts. However, the industry faces a set of critical production challenges, including build repeatability, process stability, yield rates (and, for critical components, outright failure), and the ability to deploy parts in service. Digital tools are helping resolve some of these issues: generative design, functional lattices, build planning with hardware integration, thermal distortions, and shape compensation. Some of these tools are highly specialized for certain tasks while relying on others to complete the entire additive process. As a result, an organization often depends on a disparate set of applications stitched together through file exchange. Such a system can lead to lost productivity as users work through multiple software packages, and it poses a challenge for production processes where version control, traceability, and data accuracy are critical.
A model-based approach (as opposed to a file-based one) that integrates design, materials, manufacturing, and production is paramount to the successful evolution of additive from a lab environment to a production environment. This is being addressed by software platforms that enable end-to-end digitalization of additive manufacturing while connecting additive to the rest of an organization's industrial processes.
Physics-based simulations of the additive process are crucial in assessing the finished part's quality. Much of the attention has been on powder bed metal processes, as industries (primarily aerospace, defense, and medical) work to bring certified parts to market. Mostly based on finite element methods, these simulations rely on either precalibrated strain libraries (based on scanning strategies) or thermal strains that serve as inputs to relatively fast computations of the part distortions. These methods are fairly simple to use and do not require the user to dive deep into the physics of the solutions.
To be successful with this method, users need to build these calibrated strain libraries, and in most cases, they are specific to a machine type or even a specific machine. Although it is an elegant and simple approach (also used in the welding community), issues with this method will arise as machines are retired, new machines are introduced into production, and the volumes of data and designs keep increasing over time.
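The bookkeeping behind such a calibrated library can be pictured as a lookup keyed by machine, material, and scan strategy. The sketch below is purely illustrative (the class and field names are invented, not from any real tool), but it shows why a retired machine or a new parameter set leaves holes that only fresh calibration builds can fill:

```python
from dataclasses import dataclass

# Hypothetical sketch of a precalibrated inherent-strain library.
# Keys and strain values are illustrative placeholders.

@dataclass(frozen=True)
class ProcessKey:
    machine: str        # a specific machine type, or even one serial number
    material: str       # alloy designation
    scan_strategy: str  # e.g., "stripe" or "island"

class StrainLibrary:
    def __init__(self):
        self._entries = {}  # ProcessKey -> (eps_x, eps_y, eps_z)

    def calibrate(self, key, strains):
        """Store strains measured from a calibration coupon build."""
        self._entries[key] = strains

    def lookup(self, key):
        # The weakness of the approach: a retired machine or an
        # uncalibrated parameter set simply has no entry.
        if key not in self._entries:
            raise KeyError(f"no calibration for {key}; a coupon build is needed")
        return self._entries[key]

lib = StrainLibrary()
k = ProcessKey("machine-A", "Ti-6Al-4V", "stripe")
lib.calibrate(k, (-0.002, -0.002, 0.015))
print(lib.lookup(k))
```

As machines multiply and designs accumulate, every new (machine, material, strategy) combination demands another calibration campaign, which is exactly the maintenance burden described above.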
The other approach relies on a fully thermo-mechanical solution to the process simulations. Scanning strategies can be used in lumped thermal models to predict the thermal profile as the part is being built, layer by layer (or several layers together). The thermal profiles then drive the mechanical simulations for a more accurate prediction of the distortions. The main advantage of this method is that the fidelity of the simulation can be controlled. At the lower end of the scale, highly resolved simulations at microsecond (or shorter) time scales can capture the physics of the manufacturing process down to melt-pool behavior, phase change, solidification, and microstructure evolution. These simulations are run on representative cube models (at the millimeter level) and help accurately predict residual stresses, voids, cracks, and other factors that affect the service life of functional parts.
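The layer-by-layer thermal history that drives the mechanical solve can be caricatured with a lumped-capacitance toy model. Every constant below (temperatures, time constant, layer timing) is an illustrative assumption; a real process simulation solves the full 3-D transient heat equation with moving sources:

```python
import math

# Toy lumped-capacitance sketch of a layer-by-layer thermal history.
# All parameters are illustrative placeholders, not process data.

T_MELT = 1650.0      # deposition temperature of a fresh layer, degC
T_AMBIENT = 80.0     # build-chamber temperature, degC
TAU = 12.0           # lumped cooling time constant, s
LAYER_TIME = 30.0    # time to build one layer, s

def temperature(t_since_deposit):
    """Exponential cool-down of one layer toward ambient."""
    return T_AMBIENT + (T_MELT - T_AMBIENT) * math.exp(-t_since_deposit / TAU)

def layer_temperatures(n_layers):
    """Temperature of every layer at the moment the top layer is laid."""
    return [temperature((n_layers - 1 - i) * LAYER_TIME) for i in range(n_layers)]

temps = layer_temperatures(5)
# Older layers have cooled further; the top layer is still at T_MELT.
assert temps[-1] == T_MELT
assert temps[0] < temps[-1]
print([round(T, 1) for T in temps])
```

The steep temperature gradient between the hot top layer and the cooled layers beneath it is what the mechanical simulation converts into residual stress and distortion.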
At the higher end of the scale, at the part level, a multiscale approach is used to map the lower-level scales to predict overall part distortions and stresses. The drawback of this method is that a fundamental understanding of the physics is required to create simulation models. Often, these models are part of a company's intellectual competence and, as such, mature over time. However, as hardware vendors bring to market machines with new processes, faster build rates, materials choices, and open frameworks, what works for metal powder bed processes may not apply to them.
In a powder bed fabrication process, thermal energy selectively fuses regions of a powder bed; in a binder jetting process, a liquid bonding agent is deposited to join the material powder. In a directed energy deposition process, a nozzle mounted on a multi-axis arm deposits molten material, and in photopolymerization, liquid photopolymer is selectively cured by light-activated polymerization. Each process family uses a different raw material supply form (i.e., powder, wire feed, liquid resin, ink) and manufactures parts of different material types. For example, powder bed fabrication produces metallic and plastic parts; binder jetting produces metallic, plastic, and ceramic parts; material extrusion produces plastic and composite parts.
Adding to the complexity is the fact that each process family includes many subprocess types that are differentiated by technical details and patents, such as closed or open systems, input/output formats, how raw material is introduced, how raw material is selectively heated, different types and sequences of heating and cooling sources, and how machine manufacture and environmental conditions are controlled. Under powder bed fabrication alone, there are a number of subprocess types, e.g., selective laser sintering (SLS), selective laser melting (SLM), electron beam melting (EBM), and direct metal laser sintering (DMLS). Under directed energy deposition, there are laser cladding, direct energy deposition (DED), laser metal deposition (LMD), laser engineered net shaping (LENS), and laser or electron beam wire deposition.
Furthermore, with many of these burgeoning processes, it is premature to predict if a single process or a multitude of processes will be in a company's tool kit. All indications point to the latter, where companies have a variety of machines at their disposal as they plan their additive road map. Herein lies the issue that could inhibit a true end-to-end digital thread.
Do I have to acquire yet another specialized simulation package, develop another set of techniques, and maintain libraries for individual parts and machines? What if the process parameters change? What if a given tool path for polymer extrusion does not get me the right part strength? Do I go back to the drawing board? One way to address this concern is to be rigorous in the management of the AM simulation chain: machines, processes, and materials. If not planned with care, the chances of drowning in the data lake are high.
There is a better way to address this issue, though, by providing researchers and analysts with a general-purpose simulation framework designed from the ground up to handle any machine, any process, and any material. In other words, a framework agnostic of the process, but driven at a deeper level by the science of energy and material handling. Energy decides how the material evolves during the process. For metal powder, the laser sources fuse the powder, and as it solidifies, the as-built material properties decide the part strength and quality. For polymer extrusion, the energy source to fuse the pellets is the extrusion process. Each process inputs this energy as a series of events that are distributed in space and time, predetermined as part of the manufacturing process.
The other issue is material handling. While material handling is machine specific, the simulation framework only needs to know when and how the material is handled. Again, these can be treated as distributed events in space and time. For powder bed, it is the roller laying down a new layer of powder. For multi-jet, it is the ink that gets deposited before the lamp provides the fusion energy. As before with energy, these events are determined a priori by the planning algorithms (figure 2).
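The process-agnostic idea above (energy input and material handling as events distributed in space and time) can be sketched as a single timeline of planned events. The `Event` type and its fields are invented for illustration, not a real framework's API:

```python
from dataclasses import dataclass

# Sketch of the event abstraction: energy and material handling are
# both just planned events in space and time. Names are illustrative.

@dataclass(frozen=True)
class Event:
    time: float       # s, set a priori by the planning algorithms
    position: tuple   # (x, y, z) in build coordinates, mm
    kind: str         # "energy" or "material"
    magnitude: float  # J for energy, mm^3 for material

def build_timeline(events):
    """Merge independent energy and material events into one schedule."""
    return sorted(events, key=lambda e: e.time)

# A powder bed example: roller recoat (material), two laser exposures
# (energy), then the next recoat. The solver only needs when and where.
plan = build_timeline([
    Event(0.0, (0.0, 0.0, 0.03), "material", 500.0),  # roller lays layer 1
    Event(1.0, (5.0, 5.0, 0.03), "energy", 0.2),      # laser fuses region A
    Event(1.2, (7.0, 5.0, 0.03), "energy", 0.2),      # laser fuses region B
    Event(2.0, (0.0, 0.0, 0.06), "material", 500.0),  # roller lays layer 2
])
assert [e.kind for e in plan] == ["material", "energy", "energy", "material"]
```

A multi-jet process would populate the same timeline differently (ink deposition events followed by lamp fusion events), which is exactly what makes the framework agnostic of the machine.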
Taking these independent events as inputs and automatically solving for dependent events, such as powder melting, liquid metal solidification, cooling surface evolution, temperature history, build stresses, and distortions, the simulation framework can proceed with the computations and predict the outcome of a certain manufacturing process or material. We also need to mesh the part and the underlying support structures in a regular finite element sense (at both the geometry and voxel levels) while intersecting the mesh automatically with the collected events (energy and material).
This technique makes the simulations more efficient, as it can handle multiple layers of manufacturing events while still using a regular mesh. This open simulation framework not only addresses the multitude of existing processes but can help accelerate the maturation of new processes: a new thermal mechanical physical process, a new chemical agent application method, a new UV light polymerization process, or even a new machine that uses a number of different heating devices simultaneously or sequentially to preheat and fuse the parts!
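The intersection of planned events with a regular mesh amounts to binning each event into the voxel that contains it, so material can be activated and energy deposited element by element. The grid resolution and event tuples below are illustrative assumptions:

```python
# Sketch of intersecting planned events with a regular voxel grid.
# Resolution and event data are illustrative placeholders.

VOXEL = 0.5  # regular voxel edge length, mm

def voxel_index(position):
    """Map a point in build coordinates to its (i, j, k) voxel."""
    return tuple(int(c // VOXEL) for c in position)

def intersect(events):
    """Collect per-voxel event lists: the input to the actual solve."""
    grid = {}
    for time, pos, kind in events:
        grid.setdefault(voxel_index(pos), []).append((time, kind))
    return grid

events = [
    (0.0, (0.1, 0.1, 0.1), "material"),  # powder arrives in this voxel
    (1.0, (0.1, 0.2, 0.1), "energy"),    # the same voxel is fused later
    (1.1, (1.6, 0.1, 0.1), "energy"),    # a different voxel along the scan
]
grid = intersect(events)
assert len(grid[(0, 0, 0)]) == 2   # material then energy in one voxel
assert (3, 0, 0) in grid           # the second exposure lands elsewhere
```

Because the grid stays regular no matter how the events fall, several layers of manufacturing events can be lumped into one mesh pass, which is where the efficiency noted above comes from.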
Making this simulation framework integral to the model-based digital thread allows for seamless deployment to the end users within an organization. A part designer can quickly evaluate part distortions and compensate for the shape (without the tedium of file export and import), so that the final part is within manufacturing tolerance.
The machine operator and build planner can then determine optimal part orientations or quickly verify the entire build plate by running the fast simulations, already preconfigured by the experts. They can change support and scan strategies, and the simulation models update automatically, with a full history of updates stored so they can revert to old scenarios with ease.
The analyst, who is responsible for signing off on the part's functional quality, can do so by incorporating the additively manufactured part-let's say a bracket-in her product configuration and running it across multiple loads and fatigue scenarios, since the residual stresses and material behaviors are inherently part of the part model she is provided with. If yields are being affected, the researcher can diagnose the issues off of the same digital thread, building highly detailed models to investigate defects, porosities, or crack propagation. All of the above feed into a data lake that continually enhances an organization's intellectual competence. It decreases its time to market while learning from past experiences to accelerate future production targets.
A final thought on simulations: it is imperative that simulations be validated against reality. Just as control settings for tests lead to variance in experimental data, the same applies to simulations. Modeling assumptions, physics approximations, and boundary settings all affect the simulation outcome, so verification of what you do is key. Furthermore, establishing wide, unbiased benchmarks is critical. The entire simulation community should come together and validate their methods against a common set of tests. The AM-Bench from the National Institute of Standards and Technology is an excellent step in that direction, as it will build the additive manufacturing community's trust in the role simulation plays in helping move additive from a niche manufacturing technique to a mainstream one (figure 4).
For more information on additive manufacturing, visit http://go.3ds.com/print2perform.
We want to hear from you! Please send us your comments and questions about this topic to InTechmagazine@isa.org.