Why Composability Is Essential for Scaling Digital Twins

A composable digital twin draws on six clusters of capabilities that help manage the integrated model and the digital twin instances built from it.

Digital twins allow businesses to model buildings, goods, production lines, facilities, and procedures. This can enhance performance, rapidly identify quality issues, and facilitate more informed decision-making.

Today, the majority of digital twin initiatives are one-off endeavors. A team may build a digital twin for a new gearbox, then start from scratch when modeling the wind turbine that contains it or the business process that repairs it.

Ideally, engineers would be able to rapidly assemble increasingly complex digital twins that represent turbines, wind farms, power grids, and entire energy companies. This is complicated by the many components a digital twin requires beyond the physical model, including data management, semantic labels, security, and the user interface (UI).
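The gearbox-to-energy-company hierarchy above can be sketched as a composite structure in which larger twins are assembled from smaller ones. This is a minimal hypothetical illustration, not a real digital twin platform; the `health` aggregation rule is an assumption chosen for the example.

```python
from dataclasses import dataclass, field

@dataclass
class DigitalTwin:
    """A node in a hierarchy of digital twins (hypothetical sketch).

    A gearbox twin can be nested inside a turbine twin, which in
    turn nests inside a wind-farm twin, and so on.
    """
    name: str
    state: dict = field(default_factory=dict)
    children: list = field(default_factory=list)

    def add(self, child: "DigitalTwin") -> "DigitalTwin":
        self.children.append(child)
        return self  # allow chained composition

    def health(self) -> float:
        """Aggregate a health score bottom-up: a parent's score is the
        average of its own state and its components' scores (an
        assumption made for this sketch)."""
        own = self.state.get("health", 1.0)
        if not self.children:
            return own
        total = own + sum(c.health() for c in self.children)
        return total / (len(self.children) + 1)

# Compose larger twins from smaller ones instead of starting from scratch.
gearbox = DigitalTwin("gearbox", state={"health": 0.8})
turbine = DigitalTwin("turbine", state={"health": 1.0}).add(gearbox)
farm = DigitalTwin("wind-farm").add(turbine)
print(round(farm.health(), 3))  # -> 0.95
```

Because each twin exposes the same interface, the gearbox twin built for one project can be reused inside the turbine twin of another rather than rebuilt.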

New methods for assembling digital elements into larger assemblies and models could simplify this process.

A new kind of model

The models used to construct digital twins share significant similarities with, and differ in important ways from, those used for analytics and artificial intelligence (AI). All of these initiatives begin by collecting relevant, timely historical data to drive the model's design and to calibrate the model's outputs against the present state.

In contrast to conventional statistical learning methods, the model structures in digital twin simulations are not derived directly from the data. Instead, modelers surface a model structure through interviews, research, and design sessions with domain experts so that it aligns with the strategic or operational questions defined up front.


Domain experts must therefore be involved in verifying and guiding the model structure. This time commitment can restrict simulations to applications that require ongoing scenario analysis. Creating a digital twin model is a continuous process: model granularity and system boundaries must be carefully evaluated and set to balance the time invested against the model's suitability for the questions it is designed to answer. If firms cannot appropriately limit the detail a simulation model captures, ROI will be exceedingly difficult to attain.

Composable digital twins

Here, the Capabilities Periodic Table (CPT) framework comes into play. The CPT gives diverse teams a standardized method for collaborating earlier in the development cycle. A crucial component is a reference framework for thinking about six competency areas: data services, integration, intelligence, UX, management, and trustworthiness.

This can help businesses identify composability issues that require internal or external solutions. The framework also helps pinpoint specific integrations at the level of individual capabilities. Consequently, firms can consider building a portfolio of repeatable capabilities, eliminating redundant services and effort.


Packaging capabilities

A composable digital twin comprises six clusters of capabilities that help manage the integrated model and the digital twin instances based on it. It can also combine IoT and other data services to offer an up-to-date depiction of the entity the twin represents.

The CPT displays these capabilities as a periodic table, making the framework independent of any specific technology or design. Describing a digital twin in terms of its capabilities makes it easier to pair a particular implementation with the technologies that provide each required capability, paralleling the industry-wide trend toward modular business applications. This approach lets engineers, scientists, and other subject-matter experts compose and recompose digital twins for varying business requirements.
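One way to picture pairing an implementation with the technologies that provide each capability is a twin that accepts pluggable providers for the six CPT clusters. This is a hypothetical sketch; the `ComposableTwin` class, its methods, and the example providers are all inventions for illustration, not part of the CPT itself.

```python
from typing import Callable, Dict

# The six CPT capability clusters named in the framework.
CLUSTERS = ("data services", "integration", "intelligence",
            "UX", "management", "trustworthiness")

class ComposableTwin:
    """Hypothetical sketch: a twin assembled from pluggable capability
    providers rather than built as a monolith."""

    def __init__(self, name: str):
        self.name = name
        self._capabilities: Dict[str, Callable] = {}

    def provide(self, cluster: str, impl: Callable) -> "ComposableTwin":
        if cluster not in CLUSTERS:
            raise ValueError(f"unknown capability cluster: {cluster}")
        self._capabilities[cluster] = impl
        return self

    def invoke(self, cluster: str, *args):
        if cluster not in self._capabilities:
            raise NotImplementedError(f"{self.name} lacks {cluster!r}")
        return self._capabilities[cluster](*args)

# Pair each required capability with whatever technology provides it;
# recomposing for a new business need means swapping a provider.
twin = (ComposableTwin("turbine")
        .provide("data services", lambda rpm: {"rpm": rpm})
        .provide("intelligence",
                 lambda data: "alert" if data["rpm"] > 1800 else "ok"))

data = twin.invoke("data services", 2000)
print(twin.invoke("intelligence", data))  # -> alert
```

Swapping the "intelligence" provider, say from a threshold rule to a learned model, changes the twin's behavior without touching its other capabilities, which is the modularity the capability view is meant to enable.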

In addition, it opens opportunities for new packaged business capabilities that can be applied across industries.

Composability difficulties

Currently, many digital twin projects are in the pilot stage or limited to a single asset or process. While digital twins offer enormous potential for operational efficiency and cost savings, composability difficulties are the primary cause of their slow adoption at scale. Engineers struggle to integrate the varied ways in which equipment and sensors collect, process, and format data. This complexity is exacerbated by the absence of common standards and reference frameworks that would facilitate simple data interchange.
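The data-format problem above is concrete: two sensors can report the same physical quantity in incompatible shapes. A common workaround, sketched here with invented vendor payloads and field names, is a small adapter per source that normalizes everything to one shared schema before a twin consumes it.

```python
# Hypothetical sketch: two vendors report the same measurement in
# different formats; per-source adapters normalize both to one schema.

def from_vendor_a(payload: dict) -> dict:
    # Vendor A (invented) reports Fahrenheit under "temp_f".
    return {"sensor_id": payload["id"],
            "temperature_c": round((payload["temp_f"] - 32) * 5 / 9, 2)}

def from_vendor_b(payload: dict) -> dict:
    # Vendor B (invented) reports millidegrees Celsius under "t".
    return {"sensor_id": payload["sid"],
            "temperature_c": payload["t"] / 1000}

readings = [
    from_vendor_a({"id": "a-1", "temp_f": 212.0}),
    from_vendor_b({"sid": "b-7", "t": 25500}),
]
# Downstream twins consume one schema regardless of the source.
print(readings)
```

Without a common standard, every new sensor type means another adapter like these, which is exactly the integration burden that slows scaled adoption.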

Companies attempting to scale digital twins confront several significant obstacles: a nascent data landscape, system complexity, talent availability, and limited verticalization in off-the-shelf platforms and solutions.


Threading the pieces together

The next step is to construct a second-layer composability framework with more granular capability definitions. A related effort, a "digital-twin-capabilities-as-a-service" model, will outline how digital twin capabilities could be specified and made available in a zero-touch manner through a capabilities marketplace.

Eventually, these efforts could also provide the groundwork for digital threads that facilitate the connection of processes spanning different digital twins.

In the near future, a digital thread-centric approach will take center stage, facilitating integration across both data platform silos and organizational boundaries. DataOps-as-a-service for data transformation, harmonization, and integration across platforms will be crucial if digital twin initiatives are to be modular and scalable.
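At its simplest, a digital thread is an index that ties events from otherwise-siloed twins to a shared entity identifier, so a process can be traced end to end. The sketch below is a hypothetical minimal illustration of that idea; the twin names and events are invented.

```python
from collections import defaultdict

class DigitalThread:
    """Hypothetical sketch: index events from independent digital twins
    by a shared entity ID, so a process can be traced across twin
    (and organizational) boundaries."""

    def __init__(self):
        self._events = defaultdict(list)

    def record(self, entity_id: str, twin: str, event: str) -> None:
        self._events[entity_id].append((twin, event))

    def trace(self, entity_id: str) -> list:
        return list(self._events[entity_id])

thread = DigitalThread()
# The same gearbox appears in three otherwise-siloed twins.
thread.record("gearbox-42", "design-twin", "spec v3 approved")
thread.record("gearbox-42", "production-twin", "assembled on line 2")
thread.record("gearbox-42", "maintenance-twin", "vibration anomaly flagged")

for twin, event in thread.trace("gearbox-42"):
    print(f"{twin}: {event}")
```

In practice the harmonization step would sit in front of `record`, normalizing each twin's event format before it enters the thread, which is where the DataOps services mentioned above come in.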