Nvidia CEO Jensen Huang’s comment that liquid-cooled AI racks will need no chillers created some turbulence — however, the concept of a chiller-free data center is an old one and is unlikely to suit most operators.
Investment in large-scale AI has accelerated the development of electrical equipment, which creates opportunities for data center designers and operators to rethink power architectures.
Data from Uptime Intelligence’s giant data center analysis indicates that proposed power capacity and investment tied to giant data centers and campuses are at unprecedented levels.
Direct liquid cooling (DLC) was developed to handle high heat loads from densified IT. True mainstream DLC adoption remains elusive; it still awaits design refinements to address outstanding operational issues for mission-critical applications.
DLC introduces challenges at all levels of data center commissioning. Some end users accept coolant distribution units (CDUs) without factory witness testing — a significant departure from the conventional commissioning script.
Uptime Intelligence’s predictions for 2025 are revisited and reassessed with the benefit of hindsight.
Meeting the stringent technical and commercial standards for UPS energy storage applications takes time and investment — during which Li-ion technology keeps evolving. With Natron gone, will ZincFive be able to seize the opportunity?
A bout of consolidation and investment activity in cooling systems in the past 12 months reflects widespread expectation of a continued spending surge on data center infrastructure.
Performant cooling requires a full-system approach to eliminate thermal bottlenecks. Extreme silicon thermal design power (TDP) and highly efficient cooling do not have to be mutually exclusive if data center and chip vendors work together.
Currently, the most straightforward way to support DLC loads in many data centers is to use existing air-cooling infrastructure combined with air-cooled CDUs.
Most operators do not trust AI-based systems to control equipment in the data center. This has implications for software products that are already available, as well as those in development.
In Northern Virginia and Ireland, simultaneous responses by data centers to fluctuations on the grid have come close to causing a blackout. Transmission system operators are responding with new requirements on large demand loads.
Against a backdrop of higher densities and the push toward liquid cooling, air remains the dominant choice for cooling IT hardware. As long as air cooling works, many see no reason to change, and more think it is viable at high densities.
Real-time computational fluid dynamics (CFD) analysis is gradually nearing reality, with GPUs now capable of producing high-fidelity simulations in under 10 minutes. However, many operators may question whether it is necessary.
Direct liquid cooling adoption remains slow, but rising rack densities and the cost of maintaining air cooling systems may drive change. Barriers to integration include a lack of industry standards and concerns about potential system failures.