Against a backdrop of higher densities and the push toward liquid cooling, air remains the dominant choice for cooling IT hardware. As long as air cooling works, many operators see no reason to change, and a growing number consider it viable even at high densities.
Real-time computational fluid dynamics (CFD) analysis is gradually nearing reality, with GPUs now capable of producing high-fidelity simulations in under 10 minutes. However, many operators may question whether it is necessary.
Direct liquid cooling adoption remains slow, but rising rack densities and the cost of maintaining air cooling systems may drive change. Barriers to integration include a lack of industry standards and concerns about potential system failures.
The 15th edition of the Uptime Institute Global Data Center Survey highlights the experiences and strategies of data center owners and operators in the areas of resiliency, sustainability, efficiency, staffing, cloud and AI.
Liquid cooling contained within the server chassis lets operators cool high-density hardware without modifying existing infrastructure. However, this type of cooling has limitations in terms of performance and energy efficiency.
Results from Uptime Institute's 2025 Cooling Systems Survey (n=1,033) focus on the use of data center cooling systems across the industry, zeroing in on the continued adoption of direct liquid cooling. The attached data files below provide full…
Direct liquid cooling challenges the common “line of demarcation” for responsibilities between facilities and IT teams. Operators have yet to reach consensus on a single replacement model, and this fragmentation may persist for several years.
To meet the demands of unprecedented rack power densities, driven by AI workloads, data center cooling systems need to evolve and accommodate a growing mix of air and liquid cooling technologies.
While AI infrastructure build-out may focus on performance today, over time data center operators will need to address efficiency and sustainability concerns.
How far can we go with air? Uptime experts discuss and answer questions on cooling strategies and debate the challenges and trade-offs with efficiency and costs. Please watch this latest entry in the Uptime Intelligence Client Webinar series. The…
Critics argue that data center water use is excessive and poorly managed. Operators should select a cooling system suited to the local climate and available water supply, and explain their water use in the context of local conditions.
High-end AI systems receive the bulk of the industry’s attention, but organizations looking to implement the best training infrastructure have choices. Getting it right, however, may take a concerted effort.
Compared with most traditional data centers, those hosting large AI training workloads require increased attention to dynamic thermal management, including capabilities to handle sudden and substantial load variations effectively.
The emergence of the Chinese DeepSeek LLM has raised many questions. In this analysis, Uptime Intelligence considers some of the implications for all those primarily concerned with the deployment of AI infrastructure.
AI infrastructure increases rack power, requiring operators to upgrade IT cooling. While some (typically with rack power up to 50 kW) rely on close-coupled air cooling, others with more demanding AI workloads are adopting hybrid air and direct liquid cooling (DLC).