UII UPDATE 359 | APRIL 2024
Digital twin software: Part 1
Uptime Intelligence has been observing digital twin (DT) capabilities in data center management and control software (DCM-C) for some time. While the DT concept is not new, recent advances in precision physics-based models, interactive simulations, and AI/machine learning (ML) have made DTs a more viable consideration for many operators.
Meanwhile, adapting existing facilities or building new ones for high-density IT and AI infrastructure adds to management complexity. Many operators lack the experience to design and commission high-density facilities and are uncertain about future infrastructure requirements (see The DeepSeek paradox: more efficiency, more infrastructure?).
For those planning new AI data centers or upgrading existing facilities, DTs offer the potential to design infrastructure in a virtualized sandbox environment. Performing tests, generating insights and making informed decisions can all occur before committing to significant capital infrastructure investment.
This is the first of two reports on DTs. It identifies the key attributes of DTs for data center applications and outlines the opportunities and challenges for operators. The second report will explore DT software product maturity.
A DT is a software system that uses component libraries and precision sensor data to create digital replicas of physical assets in the data center. DTs employ simulations and models to test operating scenarios, make predictions and provide recommendations based on the virtual environment. They can also be used to discover hidden faults: any discrepancy between the virtual model and the real-world facility indicates a problem with equipment, sensors or data.
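The fault-discovery mechanism can be illustrated with a short sketch: compare live telemetry against the values the twin's model predicts for the same points, and flag deviations. This is a minimal illustration in Python; the sensor names, data structures and 5% tolerance are illustrative assumptions, not any product's API.

    # Minimal sketch: flag discrepancies between live sensor readings and the
    # values the twin's model predicts for the same points. Names and the
    # tolerance are illustrative assumptions.

    def find_discrepancies(predicted: dict[str, float],
                           measured: dict[str, float],
                           tolerance: float = 0.05) -> list[str]:
        """Return sensor IDs whose measured value deviates from the model
        prediction by more than the relative tolerance."""
        flagged = []
        for sensor_id, expected in predicted.items():
            actual = measured.get(sensor_id)
            if actual is None:
                flagged.append(sensor_id)  # missing data is itself a finding
            elif abs(actual - expected) > tolerance * abs(expected):
                flagged.append(sensor_id)
        return flagged

    # Example: the twin expects 24.0 C at a rack inlet; the sensor reports 27.1 C.
    model_output = {"rack42/inlet_temp_c": 24.0, "crah1/supply_temp_c": 18.0}
    telemetry = {"rack42/inlet_temp_c": 27.1, "crah1/supply_temp_c": 18.2}
    print(find_discrepancies(model_output, telemetry))  # ['rack42/inlet_temp_c']

A flagged point does not say which of the equipment, the sensor or the data pipeline is at fault, only that the twin and the facility no longer agree; that triage remains an operator task.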
The terms visualization and simulation are often used interchangeably. However, while both rely on sensor data, they differ in objectives and capabilities.
Visualizations involve digitizing, modeling and configuring an asset or a collection of assets for monitoring, identification and capacity planning purposes. Data center infrastructure management (DCIM) software products often visualize assets, such as facilities equipment, IT servers, racks and network ports.
Simulations often build on asset visualizations and their source data, adding application and environmental data inputs to model the impact of changes under different operating conditions.
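As a rough illustration of the distinction, the sketch below layers a simple simulation on top of visualization-style asset records: the rack inventory mimics what a DCIM asset model holds, a supply air temperature is added as an environmental input, and a standard air-side heat balance estimates exhaust temperatures under a proposed change. All asset values are illustrative assumptions.

    # Minimal sketch: a simulation layered on visualization data. The heat
    # balance uses standard air properties; rack records are illustrative.

    AIR_DENSITY = 1.2          # kg/m^3
    AIR_HEAT_CAPACITY = 1005   # J/(kg*K)

    racks = [
        {"id": "rack01", "load_kw": 8.0,  "airflow_m3s": 0.9},
        {"id": "rack02", "load_kw": 30.0, "airflow_m3s": 2.4},  # high-density rack
    ]

    def exhaust_temp_c(load_kw: float, airflow_m3s: float, supply_c: float) -> float:
        """Air-side heat balance: delta-T = P / (q * rho * cp)."""
        delta_t = (load_kw * 1000) / (airflow_m3s * AIR_DENSITY * AIR_HEAT_CAPACITY)
        return supply_c + delta_t

    # Simulate a change: raise the supply air temperature from 18 C to 22 C.
    for supply in (18.0, 22.0):
        for rack in racks:
            t_out = exhaust_temp_c(rack["load_kw"], rack["airflow_m3s"], supply)
            print(f"supply {supply:.0f} C -> {rack['id']} exhaust {t_out:.1f} C")

The visualization alone answers "what is installed and where"; the simulation answers "what happens if" — here, how exhaust temperatures shift when the supply setpoint changes.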
Key attributes of a DT include:
A virtual replica of physical assets, built from component libraries and precision sensor data.
Simulation and modeling of operating scenarios away from the live environment.
Predictions and recommendations based on the virtual environment.
Fault discovery through discrepancies between the virtual model and the real-world facility.
Virtualizing a physical environment for testing purposes is a key benefit of a DT. Predicting the impact of changes away from a live operational environment helps mitigate the risks associated with equipment retrofits and new facility designs. DTs targeting these outcomes will likely be the most effective.
Despite DTs’ potential in designing and managing complex facilities, such as those for AI and high-density IT, many enterprises and operators still need to modernize and improve their systems and processes. Addressing issues around data quality, provisioning, interoperability and cybersecurity should be immediate priorities.
Operators considering DTs will likely need to invest significantly in data quality improvements, processes and system interoperability. Trust in the data (and, by extension, the DT outputs) is critical to gaining corporate buy-in and adoption.
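What such data quality gates might look like in practice: a minimal sketch that rejects stale or physically implausible sensor readings before they reach the twin. The field names and thresholds are illustrative assumptions, not a recommended standard.

    # Minimal sketch: basic quality gates on a sensor feed before it is used
    # by a DT. Field names and thresholds are illustrative assumptions.
    from datetime import datetime, timedelta, timezone

    MAX_AGE = timedelta(minutes=5)   # reject stale readings
    VALID_RANGE_C = (0.0, 60.0)      # plausible data hall temperatures

    def is_trustworthy(reading: dict) -> bool:
        """Accept a reading only if it is fresh and physically plausible."""
        age = datetime.now(timezone.utc) - reading["timestamp"]
        in_range = VALID_RANGE_C[0] <= reading["value_c"] <= VALID_RANGE_C[1]
        return age <= MAX_AGE and in_range

    reading = {
        "sensor": "rack42/inlet_temp_c",
        "value_c": 24.3,
        "timestamp": datetime.now(timezone.utc) - timedelta(seconds=30),
    }
    print(is_trustworthy(reading))  # True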
As with other types of data center management software, operators will likely be averse to sharing operational data with cloud-based DTs due to perceived security and confidentiality issues. However, cloud-based DT platforms, such as Microsoft’s Azure Digital Twins or Nvidia Omniverse, will likely be of significant interest to those designing new AI and hyperscale cloud data center facilities. Data derived from these DTs will also benefit future applications.
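For a sense of how a cloud DT platform exposes twin data, below is a minimal sketch using the Azure Digital Twins Python SDK (azure-digitaltwins-core). The endpoint URL and the 'dtmi:example:Rack;1' model ID are hypothetical placeholders, and the property name is an assumption; this is an illustration of the query pattern, not a production integration.

    # pip install azure-identity azure-digitaltwins-core
    # Minimal sketch; endpoint and model ID are hypothetical placeholders.
    from azure.identity import DefaultAzureCredential
    from azure.digitaltwins.core import DigitalTwinsClient

    ADT_ENDPOINT = "https://example-dt.api.weu.digitaltwins.azure.net"  # placeholder

    client = DigitalTwinsClient(ADT_ENDPOINT, DefaultAzureCredential())

    # Query every twin created from a (hypothetical) rack model and print
    # its ID and reported inlet temperature property.
    query = "SELECT * FROM digitaltwins WHERE IS_OF_MODEL('dtmi:example:Rack;1')"
    for twin in client.query_twins(query):
        print(twin["$dtId"], twin.get("inletTemperature"))

Note that a query like this presumes operational telemetry has already been pushed to the cloud platform, which is precisely the data-sharing decision discussed above.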
DT value creation will depend on access to rich datasets inside and outside the data center. Restricting connectivity to internal IT and operational technology (OT) systems may alleviate specific external network and software security concerns. However, any restrictions will limit the value of DTs that rely on shared data.
Open data and simplified integrations present security risks if data center software and systems are unpatched, unsupported or lack adequate authentication and access control. Data center cybersecurity remains a challenge for many operators. Since DTs rely on sharing data, weak cybersecurity could leave them vulnerable to exploits seeking to exfiltrate sensitive operational data.
DT simulations hold promise to transform outdated operational design and planning practices that often involve manual work and poor-quality data. Moreover, the ability to accurately simulate changes and identify risks and opportunities in areas such as AI and high-density IT infrastructure could have significant benefits for operators.
DTs depend on turning high-quality data inputs into accurate simulations and predictions. The reliance on sensor data and often outdated processes, however, means that operators need to prioritize data quality, system interoperability and cybersecurity to build and maintain trust.
Other related reports published by Uptime Institute include:
Pulling IT power data with software
DCIM past and present: what’s changed?
Using optimization software for cooling and capacity gains
Data center management and control software: an overview