The 14th edition of the Uptime Institute Global Data Center Survey highlights the experiences and strategies of vendors, product suppliers, engineering firms, and consultants in the areas of data center spending, customer challenges, supply chains,…
As AI supercharges the growth in data center energy demands, new developments are likely to be increasingly politicized. Central governments may support their expansion, but opposition from local authorities and environmentalists will grow.
Supersized generative AI models are placing onerous demands on both IT and facilities infrastructure. The challenge for next-generation AI infrastructure will be power, forcing operators to explore new electrification architectures.
Power grids are under stress, struggling to meet future demand and increasingly prone to outages. More utilities will expect data centers to contribute power — and be more flexible in their use of power.
Cloud providers need to win AI use cases in the early stages of their development. If they fail to attract customers at this stage, those customers' AI applications may become locked in to rival platforms and harder to move, which can have serious repercussions.
Nvidia’s dominant position in the AI hardware market may be steering data center design in the wrong direction. This dominance will be harder to sustain as enterprises begin to understand AI and opt for cheaper, simpler hardware.
As operators expand their use of hybrid IT and cloud, optimizing the IT could help alleviate concerns over availability and efficiency. This report is part two of a four-part series on data center management software.
In this inaugural Uptime Intelligence client webinar, Uptime experts discuss and answer questions on cooling technologies and strategies to address AI workloads. Uptime Intelligence client webinars are only available for Uptime Intelligence…
Visibility into costs remains a top priority for enterprises that are consuming cloud services. Improving the tagging of workloads and resources may help them to spot, and curb, rising costs.
Cyber security strategies need to extend beyond the facility to reduce the risks posed by third-party suppliers. Data center executives should apply robust, consistent supply chain risk management practices to critical data center technologies.
The cost and complexity of deploying large-scale GPU clusters for generative AI training will drive many enterprises to the public cloud. Most enterprises will use pre-trained foundation models to reduce computational overheads.
Generative AI is accelerating not only the adoption of liquid cooling but also its technical evolution. Driven partly by runaway silicon thermal power levels, technical development is converging across vendors.
Hydrogen from renewable sources is in short supply. While plentiful supplies are planned for the future, currently only a very small number of data centers use hydrogen for standby power.
Not all generative AI applications will require large and dense infrastructure footprints. This complicates AI power consumption projections and data center planning.
Interviews and workshops by Uptime Intelligence suggest that enterprises have much enthusiasm for AI, tempered by caution: most hope to avoid disruptive, expensive or careless investments.