Games engine technology is now beginning to impact electronic product design, making it easier to visualise digital twins and create virtual product prototypes supported by real-time data collected by IoT devices.
A digital twin, a digital replica of a real-world product or system that collects and integrates real-time data from its physical counterpart, is not a new idea. The first digital twin was reportedly created by NASA to help rescue the Apollo 13 astronauts. However, their use has accelerated rapidly: the global digital twin market is anticipated to reach over $63 billion, with a growth rate of 41.7 per cent CAGR between 2021 and 2027, according to Research and Markets.
The growth of digital twins is due to the massive benefits in time, cost and performance they provide, plus increased access to real-time data through IoT sensors. They differ from static digital replicas because they are far more than accurate representations: digital twins continually collect and process data from their real-world counterparts.
In practice, product designers can try out new ideas with minimal cost and risk, uncover problems earlier, and gain insight into how the parts of a product interact with each other and with the wider system. Virtually testing how a product performs in the real world, rather than the old, laborious process of building new physical prototypes, can save development budget, improve iteration velocity, speed up time-to-market and improve product quality.
Digital twins can also be used for predictive maintenance, a middle ground between corrective maintenance (fixing a part once it fails) and preventative maintenance (servicing a part on a fixed schedule, before it fails). Billions of IoT sensors collect data and feed it directly into the digital twin. As a result, unusual behaviour or sudden surges in usage can be monitored, and any issues addressed before they impact production.
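As a hypothetical sketch of how such monitoring might work (the rolling z-score threshold used here is purely illustrative, not a method described in the article), a twin could flag readings that deviate sharply from recent behaviour:

```python
from statistics import mean, stdev

def flag_anomalies(readings, window=10, threshold=3.0):
    """Flag sensor readings that deviate sharply from the recent past.

    A reading is anomalous when it lies more than `threshold` standard
    deviations from the mean of the preceding `window` readings.
    """
    anomalies = []
    for i in range(window, len(readings)):
        recent = readings[i - window:i]
        mu, sigma = mean(recent), stdev(recent)
        if sigma > 0 and abs(readings[i] - mu) > threshold * sigma:
            anomalies.append(i)
    return anomalies

# A steady temperature stream with one sudden surge at index 15
stream = [20.1, 20.3, 19.9, 20.0, 20.2, 20.1, 19.8, 20.0, 20.2, 20.1,
          20.0, 19.9, 20.1, 20.2, 20.0, 35.7, 20.1, 20.0]
print(flag_anomalies(stream))  # the surge at index 15 is flagged
```

In a production system the statistics would come from the live IoT feed and the alert would be routed to maintenance staff before the fault reaches the factory floor.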
Digital twins by market
There are numerous examples in the field across a wide range of industries, from automotive to manufacturing to aerospace, and there is a strong Digital Twin Consortium, a global ecosystem comprising industry, government and academia. It has an impressively long list of members, including Dell, Lockheed Martin and Microsoft.
Automotive companies use digital twins to build cost-effective prototypes. Static CAD models with limited dynamic views are being transformed with real-life rendering, and crash tests can be simulated more effectively across different terrains. Manufacturers use augmented reality (AR) and virtual reality (VR) for training and maintenance: through AR headsets, technicians can view the latest product specifications before their eyes while working.
Digital twins are being used to predict and improve the performance of aerospace components. Digital twins can even help consumers experience a product — for instance, a car, or a luxury jet — before committing to a purchase. These real-world or future examples are just the beginning of what is possible.
However, if there has been one fault with digital twins to date, it has been the difficulty of presenting information in a way that is context-rich and easily understandable visually. Until recently, much digital twin software has been highly technical, complex and engineering-focused. While these traditional tools have played a valuable role in the evolution of digital twins, they have had their limitations.
This is where games development engines come to the fore, providing far more realistic visualisations and making digital twins accessible to more contributors. Popular examples include Unity and Unreal Engine, among many others that games studios have long used to create compelling and complex 2D and 3D rendering.
Able to support real-time, interactive gameplay, games engines are the ideal foundation for photo-realistic, immersive simulations of the real world.
Imagine using the visual impact of the software used to create Grand Theft Auto or Call of Duty to support product development. High-quality animations provide teams with far more than just dry data: they make the experience more immersive and lifelike. This is something that some savvy manufacturers have already realised, with many more beginning to look at the benefits of games engines.
For example, according to the Perforce 2021 State of Automotive Software Report, 50 per cent of survey respondents said they are interested in using games engines in the near future. Games engines are reportedly used to train driver assistance systems with synthetic sensor data. Unreal Engine now includes digital twins for engineering, construction and architecture as one of its main solution focus areas.
Across various industries, we are hearing about games developers being recruited into teams to support digital twins, complementing the skills of existing employees.
In addition to games engines and IoT sensors, CAD and 3D modelling tools are also used to create these enhanced visual environments. Designers building digital twins create software pipelines through which CAD designs are exported and used as input data for the games engine. This pipeline is essential: integrating the different software elements removes the risk of the same change having to be made twice in two separate tools. So, if a change is made in a CAD design, it is seamlessly reflected in the digital twin.
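As a toy illustration of one stage in such a pipeline (the file names and the hash-comparison approach are assumptions for illustration, not a description of any specific product), a build step might detect which exported CAD files have changed since the last engine import:

```python
import hashlib
from pathlib import Path

def file_digest(path):
    """Content hash of an exported CAD file (e.g. a glTF or FBX export)."""
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

def assets_to_reimport(cad_exports, known_digests):
    """Compare current CAD exports against the digests recorded at the last
    engine import; return the paths whose content has changed and update
    the record so the next pass starts from the new state."""
    changed = []
    for path in cad_exports:
        digest = file_digest(path)
        if known_digests.get(path) != digest:
            changed.append(path)
            known_digests[path] = digest
    return changed
```

A scheduled job or file watcher could call `assets_to_reimport` and trigger the games engine's import step only for the assets that actually changed, keeping the CAD tool and the twin in sync without manual copying.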
Another crucial element is version control, a type of software long used alongside games engines. It ensures a 'single source of truth' for all assets (code, art, sound files and more) across the entire development pipeline, with a clear view of what has changed where, when, how and by whom. In electronic product design, multiple versions of the same digital twin can be stored and rolled back to an earlier stage of development if necessary. Version control also provides an audit trail of the development process, useful for compliance purposes.
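To make the rollback and audit-trail ideas concrete, here is a deliberately simplified, hypothetical sketch of a version history for a single asset; a real version control system does far more, but the record-keeping principle is the same:

```python
from datetime import datetime, timezone

class VersionedAsset:
    """Toy version history: every change is recorded with its author and
    timestamp, and any earlier revision can be restored without losing
    the record of what happened in between."""

    def __init__(self, content, author):
        self.history = []
        self.commit(content, author, "initial import")

    def commit(self, content, author, message):
        self.history.append({
            "rev": len(self.history) + 1,
            "content": content,
            "author": author,
            "message": message,
            "at": datetime.now(timezone.utc).isoformat(),
        })

    def head(self):
        """Current 'single source of truth' for this asset."""
        return self.history[-1]["content"]

    def rollback(self, rev, author):
        """Restore an earlier revision as a *new* commit, so the audit
        trail stays intact."""
        old = self.history[rev - 1]["content"]
        self.commit(old, author, f"rollback to rev {rev}")
        return old
```

Note that `rollback` never deletes history; the compliance-friendly property described above comes precisely from the fact that every state, including the rollback itself, remains on record.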
There are some essential considerations for games engine-based digital twins to be successful. Common mistakes include improperly stored data, trying to achieve too much too quickly, insufficient testing, poor integration of the various tools involved, and lack of stakeholder buy-in.
It is vital to create a highly integrated and scalable software foundation that can accommodate large volumes of often highly complex data from different sources. Creating internal standards to ensure that all the elements can communicate will help prevent problems down the line. Ensuring a process for collaboration and dependency management across teams and data sets is critical.
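For example, an internal standard might be as simple as a shared schema that every data source is validated against before its readings enter the twin. The field names and allowed units below are invented purely for illustration:

```python
def validate_sensor_payload(payload):
    """Check an incoming IoT message against a minimal internal standard
    (required fields, types and units) before it is fed into the twin."""
    required = {"sensor_id": str, "metric": str, "value": (int, float), "unit": str}
    errors = []
    for field, expected in required.items():
        if field not in payload:
            errors.append(f"missing field: {field}")
        elif not isinstance(payload[field], expected):
            errors.append(f"bad type for {field}")
    allowed_units = {"celsius", "rpm", "bar"}  # hypothetical house standard
    if payload.get("unit") not in allowed_units:
        errors.append(f"unknown unit: {payload.get('unit')}")
    return errors
```

Rejecting malformed data at the boundary like this keeps every downstream tool (the engine, analytics, dashboards) working from the same assumptions, which is exactly the communication problem the internal standards are meant to solve.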
Given that teams are often geographically dispersed and working remotely, that infrastructure also needs to support distributed development, giving individuals control over their own workspace while maintaining a centralised system and view of every iteration. Plus, those contributors may be using their own technology stack, so the version control system must be technology-agnostic.
Iterate versions and test often, automating any process that does not require manual intervention. From a data security perspective, the digital twin environment should ensure that only the people who need specific data have access to it, so the storage or version control system needs to provide granular access controls across users, file types, systems and networks.
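A minimal sketch of such granular, role-based access, assuming hypothetical role names and path patterns (a real system would also scope rules by user, host and network, as noted above):

```python
from fnmatch import fnmatch

# Hypothetical per-role path permissions for twin assets
PERMISSIONS = {
    "design": ["cad/*", "textures/*"],
    "firmware": ["firmware/*", "telemetry/*"],
    "admin": ["*"],
}

def can_access(role, path):
    """Grant access only when one of the role's patterns covers the path;
    unknown roles get nothing by default."""
    return any(fnmatch(path, pattern) for pattern in PERMISSIONS.get(role, []))
```

The deny-by-default stance (an unlisted role matches no patterns) is the important design choice: access is granted explicitly, never assumed.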
Two final cultural points: as is always the case with any emerging technology area, communicating objectives, setting expectations and getting stakeholder support early will make a huge difference to how well a project is received internally. So, if any roadblocks occur, senior management will be more motivated to seek a solution rather than abandon the idea. For the same reason, it makes sense to start small by picking a modest project with achievable goals and then build on that success.
Digital twins, together with game engines and IoT, will be part of the bedrock of Industry 4.0, but many organisations are still at the early stages of the discovery and learning process. However, investing time now to understand more about this powerful trio is at the very least an interesting experiment, and at most, a gateway to improved electronic development and product lifecycle management.
Author details: Brad Hart, Chief Technology Officer, Perforce Software