Using Digital Twins to Increase Operational Output by 15%: Part One
How Concept Twins Can Drastically Improve Material Flow and Throughput
As factories and distribution centers are constructed or expanded, there is continued pressure to become more efficient and to do more with less. Gains in efficiency keep costs low for consumers and businesses and enable companies to remain competitive in an increasingly aggressive global market.
Global adoption of digital twin technology is expected to grow rapidly over the next decade. Our previous article on Virtual Commissioning describes how digital twins can decrease commissioning time by up to 40%. The digital twin, which we define as the virtual representation of physical entities or systems across their lifecycle, can also be used for design validation and throughput analysis.
Early adopters have captured tangible value by using this technology in the early stages of expansion and new construction projects. At Kalypso, we call a digital twin used early in the project lifecycle a concept twin or early design twin.
Undiscovered design issues can cause substantial problems downstream
Although virtual commissioning of engineering systems saves time and minimizes risk, commissioning is one of the last phases of an engineering project, whereas design decisions are made much earlier in the process. Among these decisions, factory layout and equipment requirements are determined and shared with a network of partners, including general contractors and equipment providers. These activities are typically executed during the concept or early engineering phases. The following diagram represents the three generalized phases of developing industrial engineering systems; here we will focus on the Planning & Design phase.
Conventional methods for designing, evaluating, and finalizing material flow and layout rely on experience to make careful assumptions, then lean heavily on historical designs and cycles of manual calculation. Engineering teams generate a small number of alternative layouts and use techniques such as process of elimination to arrive at the final version. This approach has been effective, and a certain level of confidence can be drawn from a tried-and-true, linear process. However, these methods come with notable challenges that decrease stakeholder confidence and increase project risk, including:
- Difficulty in gauging concept feasibility, especially in unproven designs
- Minimal validation of process and material flow
Within manufacturing and warehouse settings, sub-optimal design decisions can directly impact every facet of operation. Because of the challenges of conventional methods, design deficiencies are often addressed only after operations have begun.
As a result, project timelines are extended, because factory acceptance is often tied to operational metrics such as production throughput. Alternatively, sub-optimal designs are improved incrementally long after commissioning, leaving significant value on the table.
Some undesirable outcomes due to sub-optimal design include:
- Reduced production or material flow
- Increased inventory on the floor
- Excess people and material movement
- Lowered capital equipment utilization
- Late discovery of quality issues
- Increased clutter that creates safety risks for workers
- Overall sub-optimal operational throughput
Using simulation to optimize material flow and layout, increasing throughput by up to 15%
Operational simulation is a well-established method for experimenting with different factory and warehouse designs. Organizations have used it for decades, with early publications on the topic dating back to the 1960s.
Simulation offers a fast, low-cost way of generating feedback for engineers. An engineering team can use it to understand, with a high level of confidence, the flow of material and the factors that positively or negatively affect production throughput.
Some of these factors include cycle times, inventory requirements, machine utilization, material handling paths and operator utilization. Along with understanding the impact of various factors on operational performance, simulation also helps identify project risks, potential cost overruns and implementation challenges, allowing the team to preemptively determine a mitigation plan.
There are three main types of techniques used for operational simulation:
- Discrete Event Simulation (DES)
- System Dynamics Simulation (SD)
- Agent-Based Simulation (AB)
Each technique has its advantages and disadvantages, and of the three, DES is the most popular for modeling manufacturing and warehouse operations.
One key reason is that DES is a form of stochastic simulation, which captures the random variability of real-world operations. Although many DES software solutions can visualize an operation (e.g., a factory), the real value lies underneath, in the mathematical simulation.
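To make this concrete, the sketch below is a deliberately small DES model built with the open-source SimPy library: parts arrive at a single station with stochastic inter-arrival and cycle times, and the run reports shift throughput and machine utilization. The arrival rate, cycle time, and shift length are illustrative assumptions, not data from a real line.

```python
# A minimal sketch of a discrete event simulation (DES) of one production
# station using SimPy. All parameter values are illustrative assumptions.
import random
import simpy

RUN_MINUTES = 8 * 60          # simulate one 8-hour shift
MEAN_ARRIVAL = 2.0            # mean minutes between arriving parts (assumed)
MEAN_CYCLE = 1.8              # mean machine cycle time in minutes (assumed)

completed = 0
busy_time = 0.0

def part_source(env, machine):
    """Generate parts with stochastic inter-arrival times."""
    while True:
        yield env.timeout(random.expovariate(1.0 / MEAN_ARRIVAL))
        env.process(process_part(env, machine))

def process_part(env, machine):
    """Seize the machine, run a stochastic cycle, then release it."""
    global completed, busy_time
    with machine.request() as req:
        yield req
        cycle = random.expovariate(1.0 / MEAN_CYCLE)
        yield env.timeout(cycle)
        busy_time += cycle
        completed += 1

random.seed(42)
env = simpy.Environment()
machine = simpy.Resource(env, capacity=1)
env.process(part_source(env, machine))
env.run(until=RUN_MINUTES)

print(f"Parts completed in one shift: {completed}")
print(f"Machine utilization: {busy_time / RUN_MINUTES:.0%}")
```

A production model would layer in additional stations, buffers, material-handling resources, and shift patterns, but the mechanics stay the same: stochastic events drive the simulation clock, and the model reports the operational metrics of interest.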
Although popular, these well-established single-method simulations may not be sufficient for large-scale problems of increasing complexity. Each subsystem can be modeled using the most appropriate simulation approach; the challenge lies in defining the relationships that link the sub-models together. Recent literature shows promising frameworks for hybrid simulation modeling, which combines the techniques above and relays information back and forth between sub-models, as sketched below.
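The following is a heavily simplified sketch of one possible coupling: a system-dynamics-style stock-and-flow view of order backlog exchanges information each week with a SimPy discrete event sub-model of a production line. The weekly cadence, parameter values, and coupling scheme are all assumptions chosen for illustration, not a prescribed hybrid framework.

```python
# A simplified hybrid-simulation sketch: an SD-style backlog stock drives a
# weekly DES run, and the DES output updates the stock. Values are assumed.
import random
import simpy

WEEK_MINUTES = 5 * 8 * 60      # one production week of five 8-hour shifts (assumed)
MEAN_CYCLE = 1.8               # mean cycle time per part, in minutes (assumed)

def run_weekly_des(release_quantity):
    """DES sub-model: how many of the released parts finish within the week."""
    completed = 0

    def produce(env):
        nonlocal completed
        for _ in range(release_quantity):
            yield env.timeout(random.expovariate(1.0 / MEAN_CYCLE))
            completed += 1

    env = simpy.Environment()
    env.process(produce(env))
    env.run(until=WEEK_MINUTES)
    return completed

# SD-style sub-model: backlog is a stock; orders and shipments are the flows.
random.seed(7)
backlog = 500                  # starting order backlog in parts (assumed)
weekly_orders = 1200           # incoming demand per week (assumed)

for week in range(1, 5):
    released = backlog + weekly_orders    # the aggregate SD state drives the DES input
    shipped = run_weekly_des(released)    # the DES output feeds back into the SD stock
    backlog = max(0, released - shipped)
    print(f"Week {week}: released {released}, shipped {shipped}, backlog {backlog}")
```

The point is the information exchange: the aggregate view sets the context for each detailed DES run, and the DES results update the aggregate state.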
The benefits of simulation performed early, in the concept or early engineering phases, are straightforward. It gives engineers a way to test layout designs and assumptions before equipment is committed and fabricated, significantly reducing the risk and cost of changes later in the design cycle; the short what-if sketch below shows the idea.
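As a simple example of this kind of what-if analysis, the sketch below runs the same two-station SimPy line model with two candidate interstation buffer sizes and compares shift throughput, the sort of question a team wants answered before ordering equipment. The station cycle times and buffer capacities are assumed purely for illustration.

```python
# A minimal what-if sketch: one line model, two candidate buffer sizes,
# compared on throughput per shift. All parameter values are assumptions.
import random
import simpy

SHIFT_MINUTES = 8 * 60

def run_line(buffer_capacity, seed=1):
    """Two stations in series separated by a finite buffer; returns parts finished."""
    random.seed(seed)
    env = simpy.Environment()
    buffer = simpy.Store(env, capacity=buffer_capacity)
    finished = []

    def station_1(env):
        while True:
            yield env.timeout(random.uniform(1.5, 2.5))   # assumed cycle time
            yield buffer.put("part")                      # blocks when the buffer is full

    def station_2(env):
        while True:
            part = yield buffer.get()
            yield env.timeout(random.uniform(1.6, 2.6))   # assumed cycle time
            finished.append(part)

    env.process(station_1(env))
    env.process(station_2(env))
    env.run(until=SHIFT_MINUTES)
    return len(finished)

for size in (1, 5):
    print(f"Buffer capacity {size}: {run_line(size)} parts per shift")
```

In practice, each scenario would be replicated with many random seeds before drawing conclusions, since a single stochastic run can be misleading.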
Some typical business outcomes that are achieved include:
- Increased material flow and overall production by up to 15%
- Maximized equipment utilization (often greater than 85%) while allowing for adequate buffer capacity
- Minimized capital expenditure to achieve the desired output
- More effective placement of inventory and production equipment
- Less disruptive and safer travel paths for people and material-handling equipment (e.g., forklifts, AGVs)
The most essential outcome is the increase in confidence that a design will achieve the desired business metrics, which are frequently measured by production output.
In summary, traditional discrete event simulation is a popular and powerful method of validating the designs of factories and distribution centers, with a focus on material flow and throughput. This type of mathematical and stochastic simulation provides engineering teams and stakeholders with higher levels of confidence, minimizes the need for costly design changes later in the project, and reduces project risk.
At this point, one may ask: if simulation is such a well-practiced technique, why are we writing about it now?
Relying on traditional simulation alone is insufficient in today's data- and software-driven world. Early design twins are the first step in leveraging digital twin technology. Combined with other tools, DES should be a critical part of every expansion and new construction project. It not only improves performance but also builds the foundation for the higher-fidelity simulations typically executed in later phases of the project.
Stay tuned for Part Two where we will dive deeper into why traditional simulation alone is insufficient.