Against a backdrop of higher densities and the push toward liquid cooling, air remains the dominant choice for cooling IT hardware. As long as air cooling works, many see no reason to change — and more think it is viable at high densities.
Several operators originally established to mine cryptocurrencies are now building hyperscale data centers for AI. How did this change happen?
The data center industry is on the cusp of the hyperscale AI supercomputing era, where systems will be more powerful and denser than the cutting-edge exascale systems of today. But will this transformation really materialize?
Serverless container services enable rapid scalability, which is ideal for AI inference. However, inconsistent and opaque pricing metrics hinder comparisons. This report uses machine learning, in the form of decision trees, to derive clear guidance.
Data center operators are increasingly aware that their operational technology systems are vulnerable to cyberattacks. Recent incident reports show a rise in ransomware attacks, which pose significant risks to data centers.
Underground hot rocks are emerging as a source of firm, low-carbon power for data centers, with new techniques expanding viable locations. Compared with nuclear, geothermal may be better positioned to support planned data center growth.
To meet the demands of unprecedented rack power densities, driven by AI workloads, data center cooling systems need to evolve and accommodate a growing mix of air and liquid cooling technologies.
As AI workloads surge, managing cloud costs is becoming more vital than ever, requiring organizations to balance scalability with cost control. This is crucial to prevent runaway spend and ensure AI projects remain viable and profitable.
Digital twins are increasingly valued in complex data center applications, such as designing and managing facilities for AI infrastructure. Digitally testing and simulating scenarios can reduce risk and cost, but many challenges remain.
While AI infrastructure build-out may focus on performance today, over time data center operators will need to address efficiency and sustainability concerns.
AI vendors claim that “reasoning” can improve the accuracy and quality of the responses generated by LLMs, but this comes at a high cost. What does this mean for digital infrastructure?
Data center builders who need power must navigate changing rules and unpredictable demand, and be prepared to trade.
Quantum computing progress is slow; press releases often fail to convey the work required to make practical quantum computers a reality. Data center operators do not need to worry about quantum computing right now.
Chinese large language model DeepSeek has shown that state-of-the-art generative AI capability may be possible at a fraction of the cost previously thought.
AI is not a uniform workload — the infrastructure requirements for a given model depend on many factors. Systems and silicon designers envision at least three approaches to developing and delivering AI.