Many operators expect GPUs to be highly utilized, but examples of real-world deployments paint a different picture. Why are expensive compute resources being wasted — and what effect does this have on data center power consumption?
Max is a Research Analyst at Uptime Institute Intelligence. Mr Smolaks’ expertise spans digital infrastructure management software, power and cooling equipment, and regulations and standards. He has 10 years’ experience as a technology journalist, reporting on innovation in IT and data center infrastructure.
msmolaks@uptimeinstitute.com
AI vendors claim that “reasoning” can improve the accuracy and quality of the responses generated by LLMs, but this comes at a high cost. What does this mean for digital infrastructure?
AI is not a uniform workload — the infrastructure requirements for a particular model depend on a multitude of factors. Systems and silicon designers envision at least three approaches to developing and delivering AI.
The emergence of the Chinese DeepSeek LLM has raised many questions. In this analysis, Uptime Intelligence considers some of the implications for all those primarily concerned with the deployment of AI infrastructure.
The US government is applying a new set of rules to control the building of large AI clusters around the world. The application of these rules will be complex.
Uptime Intelligence surveys the data center industry landscape to take a deeper look at what may actually happen in 2025 and beyond, based on the latest trends and developments. The hold that AI has on the industry is a constant topic of discussion - but how…
Uptime Intelligence looks beyond the more obvious trends of 2025 and examines some of the latest developments and challenges shaping the data center industry.
Nvidia’s dominant position in the AI hardware market may be steering data center design in the wrong direction. This dominance will be harder to sustain as enterprises begin to understand AI and opt for cheaper, simpler hardware.
In this inaugural Uptime Intelligence client webinar, Uptime experts discuss and answer questions on cooling technologies and strategies to address AI workloads.
Not all generative AI applications will require large and dense infrastructure footprints. This complicates AI power consumption projections and data center planning.
Historically, data center waste heat recovery has been promoted with a focus on the benefits for the heat off-taker. And yet, the overall winner in most situations is the data center operator — even if they are not paid for the heat.
Generative AI models brought about an influx of high-density cabinets. There has been much focus on how to best manage thermal issues, but the weight of power distribution equipment is a potentially overlooked concern.
Most operators will be familiar with the outrageous power and cooling demands of hardware for generative AI. Why are these systems so difficult to accommodate, and what does this mean for the future of data center design?
The 14th edition of the Uptime Institute Global Data Center Survey highlights the experiences and strategies of data center owners and operators in the areas of resiliency, sustainability, efficiency, staffing, cloud and AI.
Several recent outages have exposed the global dependency on a small number of third-party suppliers — and governments around the world are already taking note.