Investments back two-phase cooling as water cold plate successor

While water cold plates continue to dominate current liquid-cooling adoption, the industry has also turned its attention to alternative approaches, with two-phase cold plates in particular emerging as a promising successor.

10 Apr 2026
How AI training choices affect infrastructure costs

Choosing whether to train a model from scratch or fine-tune an existing one comes down to the use case and cost — with hardware utilization remaining an important cost factor.

23 Mar 2026
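The utilization point in the teaser above can be made concrete with a toy calculation. All of the figures and scenario sizes below are hypothetical assumptions for illustration, not values from the report:

```python
# Toy model of AI training cost: every number here is a hypothetical
# assumption for illustration, not a figure from the report.
def training_cost(gpu_count: int, hours: float,
                  dollars_per_gpu_hour: float, utilization: float) -> float:
    """Dollars spent to finish a job, given average hardware utilization.

    `hours` is the fully-utilized GPU time the job needs; at lower
    utilization the same useful compute takes proportionally longer
    wall-clock time, and every billed hour costs the same.
    """
    return gpu_count * hours * dollars_per_gpu_hour / utilization

# From-scratch training vs. fine-tuning an existing model (assumed sizes).
scratch = training_cost(gpu_count=512, hours=720,
                        dollars_per_gpu_hour=2.50, utilization=0.45)
finetune = training_cost(gpu_count=16, hours=48,
                         dollars_per_gpu_hour=2.50, utilization=0.45)
print(f"from scratch: ${scratch:,.0f}   fine-tune: ${finetune:,.0f}")
```

Even with identical per-GPU pricing, the sketch shows how job scale dominates the total and how raising utilization lowers the effective cost of both options.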
Enterprises will deploy inference in-house — if they can

Enterprises deploying AI inference need to choose carefully to limit costs and protect their data.

12 Mar 2026
Where to deploy AI inference: a guide to the economics

Although cloud platforms often offer the lowest cost for AI inference, on-premises deployment may be preferable due to application architecture, data locality and control requirements.

11 Mar 2026
Interactive AI inference costing tool

The cost of AI inference varies widely depending on deployment model, utilization and hardware. This costing tool compares on-premises, colocation and managed AI platforms on a like-for-like basis.
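A like-for-like comparison of this kind can be sketched as a simple amortized-cost calculation. The scenario figures below are hypothetical placeholders, not values from the Uptime tool:

```python
# Illustrative sketch only: all scenario parameters are hypothetical
# assumptions, not figures from the Uptime Institute costing tool.
def cost_per_million_tokens(capex_per_month: float, opex_per_month: float,
                            tokens_per_second: float,
                            utilization: float) -> float:
    """Amortized monthly cost divided by tokens actually served."""
    tokens_per_month = tokens_per_second * utilization * 3600 * 24 * 30
    return (capex_per_month + opex_per_month) / (tokens_per_month / 1e6)

scenarios = {
    # name: (amortized capex $/mo, opex $/mo, tokens/s at full load, utilization)
    "on-premises":         (9_000, 3_000, 2_500, 0.40),
    "colocation":          (9_000, 4_500, 2_500, 0.40),
    "managed AI platform": (0,     20_000, 2_500, 0.70),
}

for name, params in scenarios.items():
    print(f"{name}: ${cost_per_million_tokens(*params):.2f} per 1M tokens")
```

The structure mirrors the teaser's point: the ranking of deployment models flips depending on the utilization and hardware assumptions fed in, which is why a like-for-like tool is needed.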

Feb. 2026 AI Infrastructure Survey [Results and Crosstab files]

Results from Uptime Institute's 2026 AI Infrastructure Survey (n=1,141) focus on the data center infrastructure currently used, or planned, to host AI training and AI inference, as well as the future industry outlook on AI usage. The…

6 Feb 2026
Rising cost of traditional IT: temporary spike or long-term shift?

The shortage of DRAM and NAND chips caused by demands of AI data centers is likely to last into 2027, making every server more expensive.

3 Feb 2026
Next-gen GPUs may not need chillers — but data centers do

Nvidia CEO Jensen Huang's comment that liquid-cooled AI racks will need no chillers created some turbulence — however, the concept of a chiller-free data center is an old one and is unlikely to suit most operators.

8 Jan 2026
Supply chain exploits: the blind spots operators need to address

Cybercriminals increasingly target supply chains as entry points for coordinated attacks, yet many vulnerabilities overlooked by operators persist despite their growing risk and severity.

9 Dec 2025
Validating the use of high-density DLC

Data4 needed to test how to build and commission liquid-cooled high-capacity racks before offering them to customers. The operator used a proof-of-concept test to develop an industrialized version, which is now in commercial operation.

17 Oct 2025
Liquid-to-air eases DLC rollout, but mind the setpoints

Currently, the most straightforward way to support DLC loads in many data centers is to use existing air-cooling infrastructure combined with air-cooled CDUs.

10 Oct 2025
AI's growth calls for useful IT efficiency metrics

Large-scale AI training is an application of supercomputing. Supercomputing experts at the Yotta 2025 conference agree that operators need to optimize AI training efficiency and develop metrics to account for utilized power.

Neoclouds: AI's shock absorbers

By raising debt, building data centers and using colos, neoclouds shield hyperscalers from the financial and technological shocks of the AI boom. They share in the upside if demand grows, but are burdened with stranded assets if it stalls.

24 Sep 2025
Intel makes major play in server efficiency

The data center industry will benefit from the race between Intel and AMD for technical supremacy, but the outlook in terms of power efficiency remains challenging.

AI-generated operating procedures carry a safety risk

Many operators report that they trust AI to draft their MOPs, EOPs and SOPs. But this potentially error-prone approach demands meticulous review by an appropriate member of staff, or operators risk increasing the likelihood of costly downtime.
