In this session, we explored how these principles of data center operations were applied to support, sustain, and enhance the skills of the people who maintain the infrastructure.
Cybersecurity has traditionally not been a key focus of attention for data center operators. But cyber incidents are on the rise and concerns are growing. Unaddressed vulnerabilities leave operators at increasing risk from evolving threats.
Although the share of processing handled by the corporate or enterprise sector has declined over the years, it has never disappeared, and there are signs that it may reclaim a more central role.
For the past 15 years, the case for moving workloads out of enterprise data centers and into the cloud and colocation has been strong. Power availability and demand for high-density capacity may change that.
Human error is a weakness increasingly exploited by cyberattackers, leading to data center security breaches and greater risk for enterprises and operators.
To meet the demands of unprecedented rack power densities, driven by AI workloads, data center cooling systems need to evolve to accommodate a growing mix of air and liquid cooling technologies.
Today, GPU designers pursue outright performance over power efficiency. This is a challenge for inference workloads that prize efficient token generation. GPU power management features can help, but they require more attention.
The 15th edition of the Uptime Institute Global Data Center Survey highlights the experiences and strategies of data center owners and operators in the areas of resiliency, sustainability, efficiency, staffing, cloud and AI. The attached data files…
The past year warrants a revision of generative AI power estimates, as GPU shipments have skyrocketed, despite some offsetting factors. However, uncertainty remains high, with no clarity on the viability of these spending levels.
As AI workloads surge, managing cloud costs is becoming more vital than ever, requiring organizations to balance scalability with cost control. This is crucial to prevent runaway spend and ensure AI projects remain viable and profitable.
We hosted an exclusive briefing on the Uptime Institute Network — the global community of data center leaders dedicated to improving operational resilience, efficiency, and strategic planning.
The trend towards regulating and controlling data center energy use, efficiency and sustainability continues to grow globally, with the appearance of utility rate management regulations and the propagation of policies influenced by the EU’s EED.
This briefing report identifies and describes several de facto standards and laws used in the field of data center sustainability and efficiency (for convenience, we use the term “standards” for all).
Tensions between team members of different ranks or departments can inhibit effective communication in a data center, putting uptime at risk. This can be avoided by adopting proven communication protocols from other mission-critical industries.
Uptime Institute believes that data center operators should optimize facility-level sustainability performance before addressing ecosystem issues. A clear definition of data center sustainability is needed to enable this approach.