GPU shipments have skyrocketed over the past year, warranting a revision of generative AI power estimates despite some offsetting factors. Uncertainty remains high, however, with little clarity on whether current spending levels are viable.
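As a rough illustration of how such estimates are typically built from shipment figures, the sketch below multiplies assumed shipment, power, utilization, and PUE values. Every number in it is a placeholder assumption for illustration, not a sourced figure.

```python
# Illustrative back-of-envelope estimate of incremental generative AI power
# implied by GPU shipments. All figures below are placeholder assumptions,
# not sourced data; substitute current shipment and TDP numbers as needed.

gpus_shipped = 4_000_000        # assumed annual accelerator shipments
tdp_watts = 700                 # assumed per-GPU board power (TDP)
server_overhead = 1.5           # assumed CPU, memory, network share per GPU
avg_utilization = 0.6           # assumed average draw as a fraction of TDP
pue = 1.3                       # assumed facility power usage effectiveness

it_power_mw = gpus_shipped * tdp_watts * server_overhead * avg_utilization / 1e6
facility_power_mw = it_power_mw * pue

print(f"IT load: {it_power_mw:,.0f} MW")
print(f"Facility load incl. cooling/losses: {facility_power_mw:,.0f} MW")
```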
GPU designers today pursue outright performance over power efficiency, which is a challenge for inference workloads that prize efficient token generation. GPU power management features can help, but they require more attention.
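As a minimal sketch of the kind of power management feature in question, the example below uses NVIDIA's NVML Python bindings (pynvml, assumed installed) to read a GPU's power draw and lower its power limit. Setting limits typically requires administrative privileges, and the 70 percent cap is an illustrative choice, not a recommendation.

```python
# Minimal sketch: read GPU power draw and apply a power cap via pynvml.
# Assumes the nvidia-ml-py (pynvml) package and a supported driver.

from pynvml import (
    nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
    nvmlDeviceGetPowerUsage, nvmlDeviceGetPowerManagementLimitConstraints,
    nvmlDeviceSetPowerManagementLimit,
)

nvmlInit()
try:
    handle = nvmlDeviceGetHandleByIndex(0)          # first GPU in the system
    draw_mw = nvmlDeviceGetPowerUsage(handle)       # current draw, milliwatts
    min_mw, max_mw = nvmlDeviceGetPowerManagementLimitConstraints(handle)
    print(f"Draw: {draw_mw / 1000:.0f} W "
          f"(limit range {min_mw / 1000:.0f}-{max_mw / 1000:.0f} W)")

    # Illustrative cap at 70% of the maximum limit; token throughput usually
    # falls less than proportionally, improving tokens per joule.
    target_mw = max(int(max_mw * 0.7), min_mw)
    nvmlDeviceSetPowerManagementLimit(handle, target_mw)
finally:
    nvmlShutdown()
```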
This briefing report identifies and describes several de facto standards and laws used in the field of data center sustainability and efficiency (for convenience, we use the term “standards” for all).
Cloud AI needs cost discipline now
Data center sustainability standards grow globally
Error-proof emergency communications for facility teams
REPLAY | Inside the Uptime Network: Exclusive Insights and What's Next for Data Center Leaders
REPLAY | Annual Data Center Outage Analysis 2025
REPLAY | European Cybersecurity Regulation and its Impact on Digital Infrastructures
Cooling Options in Data Center White Spaces
CFM/kVA Question
Benchmarking - last call for EUL, ramping up "next"
Water is local: generalities do not apply
Density choices for AI training are increasingly complex
AI load and chiller systems: key considerations
Are data centers to blame for power quality issues?
Small modular reactors: building critical mass
The DeepSeek paradox: more efficiency, more infrastructure?
The two sides of a sustainability strategy
Calculating work capacity for server and storage products
OPINION | Data centers weather grid failures — but utilities want change
Cloud: when high availability hurts sustainability
Annual outage analysis 2025
Data center AI strategies are mixed in early 2025
Uncertainty and doubt as US changes GPU export rules again
In the US, data center pushback is all about power
GPU utilization is a confusing metric