Frazer-Nash's Mark Stevens considers the hard questions that the AMRC's new report, 'Untangling the Requirements of a Digital Twin', aims to answer...
I've seen the term 'digital twin' touted in various guises for a number of years, and it has certainly gained more prominence recently. It's fair to say that at times it feels overused, misused and even abused. But with the challenges of Net Zero and the growth of industrial digitisation (Industry 4.0), the concept is more relevant and necessary than ever. Digital twins can require considerable time and investment to develop, but when implemented correctly, the return on investment and benefits can be huge.
The Advanced Manufacturing Research Centre (AMRC) is working hard to support UK industry in answering hard questions about digital twins, and has recently launched its report 'Untangling the Requirements of a Digital Twin'. I particularly like its definition of a digital twin: a live digital coupling of the state of a physical asset or process to a virtual representation with a functional output. It reflects the wide breadth of uses to which digital twins can be applied, using a combination of data, statistics and models to reliably predict future scenarios. These predictions enable more effective strategic planning, asset management and assurance.
AMRC has kindly included one of our case studies in its report: a digital twin that provides reliable, unit-specific maintenance schedules for our customer's large fleet of high-value industrial gas turbines. At first glance, this seems a daunting prospect: a gas turbine is complex and its failure mechanisms are non-linear. However, we aim to simplify the problem. For example, it's unnecessary to consider the whole asset, since we only need to worry about the parts that are likely to fail first. We exploit the accuracy of complex multi-physics models, validated by tests, to understand how measured data relates to the condition at the chosen locations. However, we don't use these complex models in our digital twin. We simplify the representation further, using reduced order models, which are easy to implement and quick to run. We use statistics to underpin the physics, allowing us to quantify the uncertainty in the output. A digital platform collects the data from each unit and, via the reduced order models, predicts their condition in near real time, presenting it to the operator centrally. This enables the operator to identify when each unit requires maintenance or replacement, allowing high-value assets to be operated closer to their limits and reducing unnecessary interventions.
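To make the idea concrete, the sketch below shows, in highly simplified form, how a reduced order model with statistical uncertainty might feed a maintenance decision. Everything here is illustrative: the damage model, its coefficients, the uncertainty figure and the maintenance threshold are assumptions for the example, not the models or values used in the case study.

```python
import math

# Hypothetical reduced-order model: damage accrues per operating hour as a
# simple function of measured turbine inlet temperature. In practice the
# coefficients would be calibrated against validated multi-physics models
# and test data; these values are illustrative only.
A, B, T_REF = 1.0e-5, 0.04, 1100.0   # assumed damage-rate coefficients
SIGMA_FRAC = 0.15                    # assumed relative model uncertainty
DAMAGE_LIMIT = 1.0                   # maintenance due when damage reaches 1

def damage_increment(temp_c, hours):
    """Reduced-order surrogate: damage rate grows exponentially with
    inlet temperature above a reference value."""
    return A * math.exp(B * (temp_c - T_REF)) * hours

def assess_unit(records, z=1.96):
    """Accumulate damage over (temperature, hours) records for one unit.

    Returns the mean damage estimate, an upper ~95% confidence bound,
    and a flag raised when the upper bound reaches the limit — so the
    decision accounts for model uncertainty, not just the best estimate.
    """
    mean = sum(damage_increment(t, h) for t, h in records)
    # Treat each increment's uncertainty as independent Gaussian noise
    var = sum((SIGMA_FRAC * damage_increment(t, h)) ** 2 for t, h in records)
    upper = mean + z * math.sqrt(var)
    return mean, upper, upper >= DAMAGE_LIMIT

# Usage: one unit's operating history as (inlet temp in °C, hours) pairs
history = [(1100.0, 5000), (1130.0, 3000), (1080.0, 4000)]
mean, upper, maintenance_due = assess_unit(history)
```

In a fleet-scale platform, a model like this would run per unit on incoming sensor data, with the central dashboard flagging units whose upper confidence bound approaches the limit.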
The above is just one example. Whether the term 'digital twin' sticks is debatable, but I think the underlying approaches will only continue to grow. They will allow new technologies developed to meet social and political needs to be designed and operated more efficiently, making them a more attractive proposition to investors, policy makers and consumers alike.